Thursday, 18 December 2008

"Command line error." when installing Web Part into WSS 3.0/MOSS 2007 with stsadm.exe

This is a bizarre problem - but if you get the generic stsadm.exe "Command line error." whilst trying to install SharePoint web parts and your paths look fine, it may just be an encoding issue introduced when you copy the command-line arguments between different apps. (You do NOT need to have your wsp file in the same directory as stsadm.exe when installing parts to your site.) You see, different apps represent hyphens differently. If you copy a hyphen from a web site, it may just be a Unicode dash rather than a "real" hyphen - and stsadm.exe will not recognise it as a switch. For more detail, see:

http://weblogs.asp.net/soever/archive/2007/12/22/sharepoint-stsadm-exe-and-the-infamous-quot-command-line-error-quot.aspx
and
http://www.celestialsoftware.net/support/forums?ubb=get_topic%3bf=1%3bt=000048

A simple solution is just to make sure you type all your stsadm.exe command parameters in manually rather than copying and pasting them into your DOS prompt.
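
For example, typing a command like the following by hand (the path and solution name are illustrative) guarantees that the switches use real hyphen-minus characters:

stsadm.exe -o addsolution -filename C:\Deploy\MyWebPart.wsp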

Friday, 12 December 2008

Telerik Releases "Open Access", a Database Agnostic ORM Product that Works with LINQ

My favourite 3rd party WebUI control provider Telerik has just released its new ORM product called "Open Access" - http://www.telerik.com/products/orm.aspx. I didn't realise they were developing such a thing - but it turns out they just acquired Vanatec, a German company that specializes in ORM products. They have made a few updates to the original Vanatec software since they acquired it (such as removing a dependency on J#) - so they have grabbed this product and are running with it at full steam. If the quality of their controls is anything to go by, this could be a valuable asset in any .NET developer's toolbelt.

I'm going to try it out and evaluate it against some of the custom LINQ, LINQ to SQL and NHibernate-based efforts that I've created and worked with on previous projects. It also supports non-SQL Server databases such as Oracle. There is also a fledgling CodePlex project called LINQ to Oracle (http://www.codeplex.com/LinqToOracle), but this ORM product could blow it out of the water. Telerik Open Access also supports direct SQL. I'll do a review before the end of the year.

Wednesday, 3 December 2008

How can I set the Modified By, Created By, Date Modified, Date Updated fields via the MOSS object model? (without making a new version)

There are a few problems with the MOSS object model when adding new files using the SPFileCollection.Add() method. In particular, there is no overload that accepts both the "bool overwrite" parameter AND the details of the user who did the update at the same time.



Consequently, the upload of a file to a versioned list in SharePoint requires that you separately add the file with overwrite on and then update the User and Time stamps at a later stage.

Unfortunately, the "Author" and "Editor" fields accessible via the SPFile object are read-only. You can, however, take advantage of the UpdateOverwriteVersion() method available on list items to update these stamps manually. See the code below:



//The authenticating user needs to be a service account as the upload uses database access,
// so we must pass in the current user as a parameter when adding the file.
// (EnsureUser here is presumably a local helper around SPWeb.EnsureUser().)
SPUser updatingUser = EnsureUser(HttpContext.Current.User.Identity.Name, web);
currentFile = fileCollection.Add(newDocument.Name, contents, fileProperties, addAsNewVersionToExistingFiles);

//Get list item from SPFile object
SPListItem listItem = currentFile.Item;

//Overwrite with correct values as the object model doesn't allow us
// to both specify overwrite=true and the specific user names.
listItem["Author"] = updatingUser;
listItem["Editor"] = updatingUser;
listItem.UpdateOverwriteVersion();


How do I handle or abort Function Key events (e.g. F1, F2, etc.) in both IE and Firefox?

Run this page and you will be shown the key code for the function key you pressed. In addition, any standard browser handlers (such as the help prompt when F1 is pressed or the Search dialog when F3 is pressed) will be aborted - so you can pass the keys to your app instead. See below:


<script type="text/javascript" language="javascript">
/////////////////////////////////////////////////////////////////////
///Demo Script to display the function key that was pressed
///and abort any browser event e.g. F3 for IE find,
///F1 for IE help, F1 for Firefox Help
/////////////////////////////////////////////////////////////////////
///Version Author Date        Comment
///1.0     DDK    03 Dec 2008 Original Version for Application Tender
///                           proof of concept when users wanted
///                           'green screen' functionality
/////////////////////////////////////////////////////////////////////
document.onkeydown = showDownAndAbortEvent;

//Stop F1 opening Help in IE
document.onhelp = function() { return false; };
window.onhelp = function() { return false; };

// Decipher key down codes
function showDownAndAbortEvent(evt)
{
    //Firefox passes the event as a parameter; IE exposes it as window.event
    evt = evt || window.event;
    if (evt)
    {
        alert('keyCode:' + evt.keyCode + ';charCode:' + evt.charCode + ';');
        try
        {
            evt.preventDefault(); // disable default help in Firefox
            evt.stopPropagation();
        }
        catch (ex) {}
        try
        {
            //Kill any intercepts for IE
            window.event.cancelBubble = true;
            window.event.returnValue = false;
            window.event.keyCode = 0;
        }
        catch (ex) {}
        return false;
    }
}
</script>

Tuesday, 25 November 2008

Aspose.Words Ignores Protection Exclusion for Selected Areas when Generating Word Documents

As per the following thread, http://www.aspose.com/community/forums/showthread.aspx?PostID=153354&Subj=document-protection#153354, Aspose.Words will completely ignore your selection-based exclusions when you are trying to lock/protect parts of a document - and no fix is expected for the next 3-6 months. This is unfortunate, as it is a very powerful feature when generating MS Word documents that should only be partially editable.

Using form fields and section-based security is not a viable option, as it only allows data entry without the ability to change any formatting (this was the workaround suggested by Aspose for now). My client has reluctantly accepted that this is an inherent limitation of the document generation section of our ASP.NET application.

One interesting element in Aspose.Words' implementation is its use of tags to delimit repeating data within a table. You pass in a DataSet and it will use the field names (presumably via reflection) to render the fields of your list into a Word table.

See below for an example of the table-binding syntax of Aspose.Words. Note the «TableStart:TableName»«Field1»«Field2»«TableEnd:TableName» syntax.
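
As a minimal sketch of how the merge is driven from code (assuming a template containing tags like the above; the file paths, table name and column names here are illustrative):

//Load a template that contains «TableStart:Orders»«ProductName»«Quantity»«TableEnd:Orders»
Aspose.Words.Document doc = new Aspose.Words.Document(@"C:\Templates\OrderTemplate.doc");

//The DataTable name must match the TableStart/TableEnd tag name;
//column names map to the «Field» tags between them.
System.Data.DataTable orders = new System.Data.DataTable("Orders");
orders.Columns.Add("ProductName");
orders.Columns.Add("Quantity");
orders.Rows.Add("Widget", "12");

//ExecuteWithRegions repeats the content between TableStart and TableEnd once per row
doc.MailMerge.ExecuteWithRegions(orders);
doc.Save(@"C:\Output\Order.doc");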


Saturday, 22 November 2008

Dell D630 Problem - Cannot enable LAN Card - Fixed

I normally connect my laptop to my home network via wireless on a Belkin router (F1PI241EGau) from iiNet. Tonight, however, I tried to connect via a LAN port because it appears my router's wireless capability has been freezing every day due to Smurf attacks - and I wanted to see the log before hard rebooting it. Rebooting has the unwanted effect of clearing out any diagnostic messages I might get right before the wireless failure.

Problem was - I just couldn't enable the LAN adapter on my laptop - even through Device Manager. When I tried to enable it, it would respond with a shocked "Connection Failed!" or "Windows could not enable your device" - and would show "Device is disabled (Code 22)". Nothing at all was showing up in the event log under the Application/System categories - so the event log was about as useful as a screen door on a submarine in this situation.

I disabled some of the Dell QuickSet (f)utility settings and disabled power management to try to fix the problem. Dell has a lovely (for some) service called "NICCONFIGSVC" (C:\Program Files\Dell\QuickSet\NICCONFIGSVC.exe) that disables the network card when you are not on mains power - but I still could not enable my laptop LAN card.

The simple fix was to uninstall the problematic network card and restart. I now have a brand spanking new Local Area Network Adapter 3 which is fully functional - and I can now proceed to diagnose the nasty Smurf DoS attacks.

Tuesday, 4 November 2008

List of Alternative Options for Deploying or Hosting Applications to SharePoint 2007 or WSS 3.0

There are several different ways you can deploy an application to MOSS:
  1. Create Custom built Web Parts and deploy them as features
  2. Drop an ASP.NET application in c:\program files\common files\microsoft shared\web server extensions\12\template\layouts that gets virtualized to every site on your server
    e.g. to http://servername/sites/sitename/_layouts/MyApp/SomePage.aspx
  3. Use User Controls and the Son of SmartPart (See http://www.smartpart.info/default.aspx)
  4. Using ASPX pages added to a SharePoint site (this involves deploying to the bin folder of your SharePoint site such as C:\Inetpub\wwwroot\wss\VirtualDirectories\moss.litwareinc.com80\bin, adding SafeControls web.config entries to allow your assemblies to load, and deploying your aspx pages to the relevant site)

For more info see

http://blogs.msdn.com/cjohnson/archive/2006/09/05/application-development-on-moss-2007-amp-wss-v3.aspx. It also has a very handy decision matrix for when you are having trouble deciding which deployment model/architecture you should use.

Two further options are described here:

http://sharenotes.wordpress.com/2008/02/21/add-custom-aspx-pages-or-asp-net-pages-in-sharepoint/

  1. Using features and a WSP package: following the steps recommended by Andrew Connell (MOSS MVP). I believe this is the standard approach used in the SharePoint developer community.
  2. Using VSeWSS (Visual Studio extensions for WSS): this is yet another (and the latest) solution. Microsoft recently released the VSeWSS 1.1 RTM. Using this, we can deploy ASP.NET pages into SharePoint by setting up a new project in Visual Studio. VSeWSS creates a solution package using features. Set up the project, hit 'Deploy' and it is done.

Monday, 3 November 2008

Using Attributes, Reflection and GetCustomAttributes() to support field mappings between systems

My task today was to generate MS Word documents using Aspose.Words and merge fields. The problem was that the names of the merge fields did not exactly match the names of the properties of my objects - and I didn't want to map all the fields in my objects to the external components in a separate method call. I approached this with a mechanism I have used several times before - using custom attributes and reflection to drive the property mappings between application entities and external components such as documents or emails. See below for the details. Note the use of GetCustomAttributes() for retrieving the custom attributes on your properties.

The attribute class:



/// <summary>
/// Supports the mapping between Aspose.Words merge fields and the fields
/// in the Data Transfer Object
/// </summary>
[AttributeUsage(AttributeTargets.All)]
public class DocumentMergeFieldMappingAttribute : System.Attribute
{
    public string MergeFieldName { get; set; }

    public DocumentMergeFieldMappingAttribute(string mergeFieldName)
    {
        MergeFieldName = mergeFieldName;
    }
}
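
For example, a DTO property would then be decorated like this (the class, property and merge field names here are illustrative):

public class CustomerDto
{
    //Maps the CustomerName property to the «CUSTOMER_NAME» merge field in the Word template
    [DocumentMergeFieldMapping("CUSTOMER_NAME")]
    public string CustomerName { get; set; }
}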


The mapping class, which builds up a Dictionary of name/value pairs:


/// <summary>
/// Gets a dictionary of field values for merging into documents - uses the
/// DocumentMergeFieldMappingAttribute to determine which dto properties
/// should render to the Document itself.
/// </summary>
/// <param name="inputDto"></param>
/// <returns></returns>
private IDictionary<string, IFieldValueDto> GetDtoFieldValues(object inputDto)
{
    Dictionary<string, IFieldValueDto> dictionary = new Dictionary<string, IFieldValueDto>();

    Type objectType = inputDto.GetType();
    PropertyInfo[] properties = objectType.GetProperties();

    foreach (PropertyInfo property in properties)
    {
        foreach (Attribute attribute in property.GetCustomAttributes(true))
        {
            if (attribute is DocumentMergeFieldMappingAttribute)
            {
                string fieldValue = property.GetValue(inputDto, null) as string ?? string.Empty;
                //Store the property value against the mapped document field name
                dictionary.Add(((DocumentMergeFieldMappingAttribute)attribute).MergeFieldName,
                    new FieldValueDto<string>(fieldValue));
            }
        }
    }
    return dictionary;
}

Thursday, 30 October 2008

Uploading Files to MOSS 2007 via the SharePoint Object Model

I blogged on a similar topic about a year ago, but the previous post used the Copy.asmx web service directly (see http://ddkonline.blogspot.com/2008/01/uploading-files-to-moss-2007-via.html). Here is the equivalent using the SharePoint object model, for when you have the luxury of being hosted on the WSS/SharePoint box. One thing to note is that you will chew up memory if you don't dispose of your SPWeb and SPSite objects correctly (either via using blocks or via explicit Dispose() calls) - you cannot rely on .NET Framework garbage collection to clean them up for you. See the inline comments below on this.

Background: the current management app I'm working on has an IIS application hosted in Atlanta as the frontend to a Windows SharePoint Services (WSS) document store hosted in Australia. The problem with this was that the upload would do a massive round-trip overseas and back again - so a large file (say 20MB) would take well over an hour. The workaround was to host the upload page in an IFrame in our application and serve it from a WSS server, so uploads of large files could be done directly to WSS rather than via the IIS server overseas. I used the following code in the hosted page to perform the uploads using the SharePoint 2007 object model.

/// <summary>
/// Uploads a document using the MOSS object model directly, and updates the database
/// with the document information after the upload of each document is complete.
/// </summary>
/// <param name="uploadedFile"></param>
/// <param name="projectNumber"></param>
/// <param name="projectId"></param>
/// <param name="activityId"></param>
/// <param name="documentTypeLookupId"></param>
/// <param name="constructionAuthorisationId"></param>
/// <param name="subfolderPath"></param>
/// <returns></returns>
public static VoidResponse UploadDocument(UploadedFile uploadedFile,
    string projectNumber, int projectId, int activityId, int documentTypeLookupId,
    int constructionAuthorisationId, string subfolderPath)
{
    //Core document information to be passed into MOSS
    //and into the .NET Web Service to update the database
    DocumentDto newDocument = new DocumentDto();
    newDocument.ProjectNumber = projectNumber;
    newDocument.Name = uploadedFile.GetName();
    newDocument.SharepointListName = ConstructSharepointListName(newDocument.ProjectNumber);
    //Set the snapshot path for the new document
    newDocument.SharepointSnapshotPath = CurrentSnapshotPath;

    if (string.IsNullOrEmpty(newDocument.SharepointFileRelativePath))
    {
        newDocument.SharepointFileRelativePath = newDocument.Name;
    }

    string queryOptions = FormatHelper.GetQueryOptions(projectNumber, CurrentSnapshotPath, subfolderPath);

    //To allow return of the uploaded file details
    VoidResponse response = new VoidResponse();

    //Housing SPSite and SPWeb in using blocks so they are disposed correctly as per best practices
    //in http://msdn.microsoft.com/en-us/library/aa973248.aspx
    using (SPSite site = new SPSite(SharePointParameter.SharePointSiteUrl))
    {
        using (SPWeb web = site.OpenWeb()) //The SPSite already has the site url value.
        {
            try
            {
                //Using AllowUnsafeUpdates because we are creating an SPWeb ourselves -
                //otherwise the update will result in an exception, as per
                //http://hristopavlov.wordpress.com/2008/05/16/what-you-need-to-know-about-allowunsafeupdates/
                web.AllowUnsafeUpdates = true;
                SPList list = web.Lists[newDocument.SharepointListName];

                SPFileCollection fileCollection = null;

                //Using query options as recommended by
                //http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.splist.aspx,
                //the same as the Web Services do in the LL framework.
                SPQuery spQuery = new SPQuery();
                spQuery.Query = queryOptions;
                SPListItemCollection folderCollection = list.GetItems(spQuery);

                //Folders are really just SPListItems with a type of SPFileSystemObjectType.Folder
                //Defensive: folder not found exception
                if (folderCollection.Count == 0) //Error condition - folder not found
                    throw new ConfigurationErrorsException(string.Format(Messages.Pcp_MOSS_FolderNotFoundError,
                        newDocument.SharepointSnapshotPath));

                fileCollection = folderCollection[0].Folder.Files;

                //Only perform the duplicate check when it is turned on in configuration settings.
                if (SharePointParameter.CheckForDuplicateFilesInMOSS)
                {
                    //Check that the file doesn't already exist.
                    foreach (SPFile file in fileCollection)
                    {
                        if (file.Name == newDocument.Name)
                        {
                            //Duplicate detected
                            response.IsSuccessful = false;
                            response.Messages.Add(new Message(string.Format(Messages.Pcp_DuplicateFileDetected,
                                file.Name)));
                            return response;
                        }
                    }
                }

                //Note: assumes the whole stream can be read in a single call
                Stream fileStream = uploadedFile.InputStream;
                byte[] contents = new byte[fileStream.Length];
                fileStream.Read(contents, 0, (int)fileStream.Length);
                fileStream.Close();

                //Upload the document itself. This will also give us the new Id to store in the database.
                SPFile currentFile = null;

                currentFile = fileCollection.Add(newDocument.Name, contents);
                System.Security.Principal.WindowsImpersonationContext impersonationContext = null;

                try
                {
                    if (GetUseCurrentUserCredentialsForMOSSFlag()) //Impersonate if uploading document
                    {
                        impersonationContext =
                            ((System.Security.Principal.WindowsIdentity)
                            HttpContext.Current.User.Identity).Impersonate();
                    }

                    //Upload the file details to the main PCPData database
                    PcpDataService.PcpDataService pcpDataService = new PcpDataService.PcpDataService();
                    pcpDataService.Url = DDKOnline.WssHostedWeb.Shared.SystemParameter.DDKOnlineDataServiceUrl;
                    pcpDataService.AddProjectDocument(projectId, activityId, constructionAuthorisationId,
                        currentFile.Item.ID.ToString(), newDocument.SharepointListName,
                        newDocument.SharepointSnapshotPath, newDocument.SharepointFileRelativePath,
                        documentTypeLookupId, newDocument.Name);
                }
                finally
                {
                    if (GetUseCurrentUserCredentialsForMOSSFlag() && impersonationContext != null)
                    {
                        impersonationContext.Undo();
                    }
                }
                //Successful upload completed.
                response.IsSuccessful = true;
                response.Messages.Add(new Message(Messages.Pcp_SharePointUploadComplete));
            }
            catch (SPException spException)
            {
                //Error occurred during upload
                response.Errors.Add(new DDKOnline.Framework.Common.Response.Error(spException.Message));
                response.IsSuccessful = false;
            }
            catch (Exception ex)
            {
                response.Errors.Add(
                    new Error(string.Format(Messages.Pcp_SharePointAccessFailed)));
                ExceptionPolicy.HandleException(ex, SERVICE_EXCEPTION_POLICY);
                response.IsSuccessful = false;
            }
            finally
            {
                if (web != null) web.AllowUnsafeUpdates = false;
            }
        }
    }
    return response;
}


/// <summary>
/// Generates query options to support storage within subfolders of the current list
/// </summary>
public class FormatHelper
{
    internal static string GetQueryOptions(string projectNumber, string currentSnapshotPath, string subFolderPath)
    {
        string folderPath = VirtualPathUtility.RemoveTrailingSlash(
            string.Format("{0}/{1}/{2}", projectNumber, currentSnapshotPath, subFolderPath));
        string queryOptions = string.Format("<Query/><QueryOptions><Folder>{0}</Folder></QueryOptions>", folderPath);
        return queryOptions;
    }
}

Tuesday, 28 October 2008

How do I get the underlying type of a Generic List?

The Type.GetGenericArguments() method lets you determine the underlying type of the elements of a generic list object at runtime. See this article for more information:

http://msdn.microsoft.com/en-us/library/system.type.getgenericarguments.aspx

For example:


// t could be, for example, the runtime type of a list instance:
// Type t = new List<string>().GetType();
if (t.IsGenericType)
{
    // If this is a generic type, display the type arguments.
    Type[] typeArguments = t.GetGenericArguments();
    Console.WriteLine("\tList type arguments ({0}):", typeArguments.Length);
    foreach (Type tParam in typeArguments)
    {
        if (tParam.IsGenericParameter)
        {
            // If this is a type parameter, display its position.
            Console.WriteLine("\t\t{0}\t(unassigned - parameter position {1})", tParam,
                tParam.GenericParameterPosition);
        }
        else
        {
            Console.WriteLine("\t\t{0}", tParam);
        }
    }
}

Wednesday, 15 October 2008

Zach William Klein arrives!

Our fabulous new baby boy Zach William Klein ("Zacky Zack") was born on 10/10/2008 at 10:19am with a championship weigh-in of 3.855kg. We have just brought him home from hospital and, after a shocker of a first night, he has calmed down to 3 sleeps a night on day 2. Welcome home Zach!

For many more first piccies see http://www.ddkonline.com/Gallery/2008_10_October_ZachKlein_Day1/gallery.html or my most recent FaceBook photo gallery.

Tuesday, 7 October 2008

How to do remote debugging with the Visual Studio 2008 Remote Debugger Service (Msvsmon.exe)

There is always a problem that will crop up on one of your servers that you just CANNOT reproduce at all locally. To solve pesky problems like this, you can make use of the Visual Studio remote debugger service. What's more, you can debug without running a setup package on your dev or production servers at all - instead, you can just run msvsmon.exe from a file share without installing anything. This will definitely keep the network guys happy!

Typically, you can get msvsmon.exe file from the following path on your development machines that already have Visual Studio 2008 installed:
C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\Remote Debugger\x86\msvsmon.exe

When you run msvsmon.exe, it just shows up as a windows application on your server (see screenshot). You can also run it as a Windows Service so you don't need to log onto the server to start it up.

In Visual Studio on your local machine, you then just put the fully qualified remote server name into the "Attach to Process" dialog. Now you can debug the remote machine to your heart's content (make sure you're a debugger user or remote admin, otherwise you will get access denied errors).

Just a reminder - if you are debugging ASP.NET on a 2003 server or above, there will be one w3wp.exe process for each of your application pools. It may be hard to find out which w3wp.exe process you want to debug as there may be many application pools. You can find out which w3wp.exe process is running your pool by a process of deduction (i.e. just by stopping the application pools you are not using - the one left is yours), or with the script shown below.
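
On Windows Server 2003, the built-in iisapp.vbs script lists each w3wp.exe process ID against its application pool name, which saves the stop-and-check shuffle:

cscript %SystemRoot%\system32\iisapp.vbs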

For the full goss, see http://support.microsoft.com/default.aspx/kb/910448. There is even a 17-minute tutorial on setting it up - see http://www.microsoft.com/uk/msdn/screencasts/screencast/313/visual-studio-2008-remote-debugging-with-msvsmonexe.aspx

Just keep in mind - you may be in for a long wait if the connection to your server is slow - especially if attaching to a process in Atlanta! :o). There is a timeout option if you get timeout errors (you only get timeout errors in "No Authentication" mode, as the "Windows Authentication" mode has an infinite timeout).


System.Web.VirtualPathUtility to get File Names from Urls and Converting Relative Paths to Absolute Urls (without ResolveUrl)

One of the things that appeared to be lacking in ASP.NET 2.0 and above (or so I thought) is a utility library to parse and extract paths and get the file name from a Url. Here are some examples of the url parsing code that I and others have written without knowledge of this library: http://www.thejackol.com/2007/04/10/get-file-name-from-url-cnet/
and http://www.west-wind.com/Weblog/posts/154812.aspx

One class that does just this (and which most developers don't know about) is System.Web.VirtualPathUtility. Everyone knows its brother utility class System.Web.HttpUtility, but VirtualPathUtility should be an essential part of any ASP.NET developer's toolkit. See http://msdn.microsoft.com/en-us/library/system.web.virtualpathutility.aspx for more information.
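
A few quick examples (the paths are illustrative; ToAbsolute resolves "~" against the current application path, so it needs to run inside a web request):

string fileName = VirtualPathUtility.GetFileName("~/Gallery/Pages/Default.aspx");   // "Default.aspx"
string extension = VirtualPathUtility.GetExtension("~/Gallery/Pages/Default.aspx"); // ".aspx"
string absolute = VirtualPathUtility.ToAbsolute("~/Gallery/Pages/Default.aspx");    // e.g. "/MyApp/Gallery/Pages/Default.aspx"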



Thursday, 4 September 2008

Microsoft Tech Ed Australia 2008 - Top Tech Ed Takeaways

The veritable tsunami of information that is Tech Ed 2008 has been and gone. One of the things I've found about some conferences like Tech Ed is that they have a breadth of information but sometimes don't engage me or don't go into the depth of detail that I would like (apparently, PDC is meant to be more in-depth). However, 5 of the presentations I went to DID keep me on the edge of my seat:

  1. ARC304 - "Building Loosely Coupled Applications with Unity" with Richard Banks
  2. DEV410 - "Debugging the world, starting with the CLR (or Debugging from the trenches)" by Corneliu on debugging (http://www.acorns.com.au/)
  3. SOA 311 - "Building Human Workflows with WF State Machines" with Kirk Evans
  4. WEB301 - "ASP.NET MVC – Should you care?", WEB302 - "ADO.NET Data Services – The Zen of RESTfulness and the Art of “Astoria”" - Scott Hanselman is truly an entertaining and informative presenter.
  5. WEB304 "Web Futures - the next 18 months"

My Top Takeaways

  1. MS wants you to utilize the "cloud" because it is getting in on the action
    Offerings like SQL Server Data Services are now out in beta, so we can utilize the "infinitely scalable processing and storage" of the cloud.
  2. Get cracking with ADO.NET Entity Framework and LINQ to Entities, as it solves some problems in LINQ to SQL such as doing lazy loading and dealing with the database-to-application impedance mismatch, with support for inheritance in your EDMX data model. However, it unfortunately does NOT solve the replay problems we have in LINQ to SQL (there are projects like http://code.msdn.microsoft.com/entitybag/ to cover this before Entity Framework 2.0 comes out)
  3. Use Unity for all your dependency injection needs from now on
    It is the Web Client Software Factory ObjectBuilder on steroids.
  4. Try harder to use object oriented techniques with JavaScript
    The trend is towards more JavaScript in applications, not less (especially with some of the frameworks out there like jQuery and improved design-time support) - you can simplify your code with prototypes and JavaScript inheritance.
  5. MVC is a no-go. Only consider using the MVC framework for extreme control of html output and testability. From a business viewpoint, the value benefits are minimal but the costs due to increased complexity and "plumbing work" are significant. This makes it a no-go for me. I don't believe that you should have unit tests for absolutely everything - just the core logic. Otherwise you get massively declining returns trying to increase your test coverage. In addition, the MVC Framework is not ready for prime-time (currently in Preview 5) - so don't even start any projects or frameworks that use it.
  6. Silverlight is a big question mark (for app UI development, not for streaming media apps) & you should follow the KISS principle. The Web Futures panel, including presenters like Scott Hanselman, came down to 2 significant truths:
    1. A complaint was made about the ongoing treadmill of new/old technologies like Xaml (Silverlight and WPF in particular). The answer from Mr Hanselman was that any technology you use now will look crap in 18 months. The golden rule to mitigate this problem is to go with the "simplest solution to meet the requirement, the simplest requirement to meet the need". In other words, "complexity kills" any good project - do the simplest possible thing that will work and refactor it tomorrow.

    2. JavaScript and Silverlight are competing models for UI development - but JavaScript has a massive upper hand in terms of industry support and object models like jQuery (http://jquery.com/). There are also no killer apps for Silverlight. I'm going to hold back on Silverlight development for non-streaming media apps for now...

Other Notes:

Some of the takeaways that I found from Tech Ed 2008 were (in no particular order):

  1. Tech Ed 2008 Keynote - this set the tone for the rest of the conference - much of the presentation was built around loose coupling of software and services (Software Plus Services (S+S)), focussing on a few key products like Live Mesh, SQL Server Data Services and the latest version of Exchange. Live Mesh was trumpeted as a peer-to-peer application platform. They also did a demo, with Dr Neil Roodyn, of a custom app called "Smart Asset" used by the New Zealand Antarctic base to showcase MS's Services part of the "S+S" equation, mainly around Virtual Earth. They also did a demo of the web app "Street Advisor" – and what MS did to improve an existing site with these services, including the VE map control and the Web Messenger control to add immediacy to interaction with sites (you don't need Messenger installed for this functionality).


  2. ARC 203 - Understanding Software Plus Services - identified some of the key industry trends: SOA (based on reuse and agility), SaaS (based on flexible pricing and delivery), RIA (based on experience), and cloud computing (based on service utility). Covered the fact that in "S+S", the Software part normally includes LoB apps. Established a grid for the tradeoffs of different architectural decisions (control vs economy of scale, buy vs build), and a spectrum between "On-Premise" (aka on-site), hosted, cloud and vendor architectural solutions. Gave some general rules for S+S architectural decisions - you should typically host out any commodity services as SaaS (e.g. CRM and email are commodity applications). In particular, a firm should spend money on key differentiators (non-commodity applications and services) and keep them in-house. Jungle Disk was given as an example of S+S.


  3. VB session on the VB features in .NET 3.5 - I didn't attend, but discussed it with some other attendees; some of the best takeaways were around XML literals (not a feature built into C# - but there are ways around it with the "Paste as XML" addin).


  4. SOA 311 - Building Human Workflows with Windows Workflow Foundation State Machines - Kirk, the presenter, indicated how confusing and verbose a lot of the documentation on WF is. This presentation used a Windows Forms application to host a workflow, like most of the examples on the web - however, a new feature of 3.5, "Workflow Services", was mentioned for hosting WF.

    It pointed out a few of the issues with the current iteration of WF, e.g. there is no direct equivalent of a "flowchart" in WF (i.e. one that loops back to previous decision points). Some in-depth examples of the state machine workflow were given. Sequential workflows cannot return to a previous position – this is a key differentiator from state-based flows. State machines are more complicated (they are essentially a Turing machine) but are very flexible in terms of flows. States can have child states (aka "recursive states") - whereby you don't have to do the bubbling of a Cancel up yourself. An important concept in state machines is the transitions of a state-based workflow - e.g. an Assigned state has OnApproved/OnRejected transitions. WF will track the possible state transitions and state history for you. A workflow has to have an initial state and a completed state - these are properties of the workflow and point to one of the state objects on your diagram (via a drop-down in the property explorer). To do a state transition, use a state transition object (it just sets the state from one to the other) - simply set the target state name in the properties of the "SetState" activity. The presenter did a walkthrough of a supply fulfillment workflow and how the SetState activity is used to do the transitions.

    You handle external events just by defining an interface and adding an [ExternalDataExchange()] attribute on the class, with all the events (these show in the dropdown when the interface type is set). Create event args which inherit from ExternalDataEventArgs to transfer information, e.g. SupplyFulfillmentArgs – with the InstanceId as a Guid; these should be marked [Serializable] for persistent workflows. Then add a concrete class that just implements our sample IEventService and raises these events, e.g. RaiseAssignedEvent. In the form/workflow hosting environment: (i) the WorkflowRuntime should be declared in the form; (ii) have a Guid to track the instance of the workflow you are showing/dealing with. In the form load, add your ExternalDataExchangeService, then do workflowRuntime.AddService(externalDataService).
    Then do a workflowRuntime.StartRuntime and start the instantiated workflow object. With the persistence service, processes are no longer bound to a processor/app domain. There are only 2 database tables in the persistence module for WF - "InstanceState" and "CompletedScope".

    In the Windows form, you instantiate a SqlWorkflowPersistenceService, which inherits from WorkflowPersistenceService - so you can implement custom persistence if you need it.
    The LoadIntervalSeconds setting in the config file allows you to control how often to load from the persistence store. Tracking services are used for monitoring reports and business processes. This just serializes the BLOB.
    In tracking services, there are several different event types:
    a. Workflow instance events - completed, persisted, etc.
    b. Activity events - closed/executing, initialized
    c. User events – this.TrackData(); Tracking Profile -> Tracking Service

    Tracking queries should be done through the tracking service object model. To do a query, you instantiate SqlTrackingQuery and SqlTrackingQueryOptions (e.g. the options could pass in WorkflowStatus.Completed). The result comes back as an IList and will have 1 of 3 types – workflow/activity/user events in the return list (just cast back to determine which type it is). You need the SharedWorkflowCommitWorkBatchService to allow sharing of the connection string between the tracking and persistence services.

    Samples at blogs.msdn.com/kaevans
  5. One of the chalk-and-talk and lunchtime sessions demoed what Silverlight can do; the sample sites given were:
    a) http://www.cookingwithxaml.com/meals/financials/default.html
    b) http://www.contosobnk.com/


  6. WEB311 - Designing Compelling Silverlight User Experiences With Expression Studio - showed the use of the Expression suite of tools to create some of the data-driven Silverlight animations used at Tech Ed itself. One of the tips was that you can't just copy and paste wireframes generated in Visio or PowerPoint into Expression Blend (they just show as bitmaps in Blend, not separate objects). However, you can get them in easily by exporting to pdf and giving the file an "ai" extension - you can then drag the file into Blend and modify the objects in your wireframe separately. The presenter then showed some of the standard techniques - such as slicing and using Windows Movie Maker to create basic timed transitions. The presenter also demoed the use of overlays, converting slices to user controls, and the use of ease in/ease out. Some parts of the demo stopped working - but this was due to the inadequate visual indication in Blend of whether timeline recording is on or off. One important note from the presentation is that sometimes you cannot use Silverlight canvases for more advanced functionality - in particular, Silverlight canvases don't support triggers, but WPF canvases do.


  7. DEV410 - "Debugging the world, starting with the CLR (or Debugging from the trenches)" - this presentation focussed on making debugging easier and the innards of debugging using windbg. One of the cool things was using a memory dump from another machine to extract the loaded dlls and stack trace and work out what happened on the production server.

    After discussing the use of the .NET debugger attributes, Corneliu showed how the "Make Object ID" option in the debugger context menu in Visual Studio allows you to keep an object in scope even when the debugger falls out of scope.

    After a rundown on using windbg with .NET, we were shown a GUI tool that helps you to go through memory dump files without commandline tools such as adplus.exe - http://www.thinktecture.com/SOSAssist


  8. DEV380 - CRM Integration with External Applications
    Covered some of the basics of integrating CRM with other applications via SQL filtered views, .NET plugins and web services. Web services are created for each CRM entity. CRM was discussed as much more than a CRM product - it is a complete application platform based on metadata – similar to a 4GL language – and CRM handles things like schema changes behind the scenes.


  9. WEB309 - Silverlight for Developers
    Went through doing IoC and testing operations in Silverlight 2 - http://jonas.follesoe.no/
    An interesting part was removing codebehind completely by using the ObservableCollection


  10. WEB301 - ASP.NET MVC – Should you care? - with Scott Hanselman. This presentation convinced me that MVC has its place, but not for most projects. It is NOT ASP.NET Web Forms 4.0. It really just gives you more testability and control over the output html. It also gives you more control over the format of the Url - something you can do anyway with System.Web.Routing if you want, since it was pulled out of System.Web.Mvc.Routing and into System.Web.Routing in 3.5 SP1.

    Scott also outlined how the MVC framework basically has some hardcoded folders - /Models, /Views and /Controllers. He also went through the call stack of the Hello World sample app to show the differences between the standard webforms model and how MVC requests are processed. It is really just another http handler - "MvcHandler". Also noted was that views are searched for in a specific order by System.Web.Mvc - it looks for aspx, then ascx, etc. There is no viewstate and there are no postbacks, as it doesn't use the normal http handler. One killer issue is that there are no ASP.NET MVC controls yet.

    MVC is hooked in by modifying Application_Start in the global.asax to add a call to RouteTable.Routes.MapRoute().

    For the purposes of testing, you can use Mock tools such as Moq/RhinoMocks/TypeMock.

    MVC requests all return a particular result type - such as ActionResult, ViewResult, JsonResult or DownloadResult.

    Scott also used a tool called HttpWatch Pro which is similar to YSlow in Firefox.
  11. ARC304 - "Building Loosely Coupled Applications with Unity"
    Richard Banks explained how the Microsoft IoC/dependency injection framework called "Unity" makes interface-based programming easier.

    He outlined some of the real benefits of using interfaces to make your code much more modular and interchangeable with minimal effort. In a tightly coupled application, you typically have concrete classes within other classes, and statics. These direct dependencies mean that it is very hard to swap different functionality into your code. Loosely coupled applications use facades and interfaces.

    The main difference between Unity and the ObjectBuilder in the Web Client Software Factory from MS patterns & practices is that it is more configuration-based - i.e. all the dependencies of your application can be defined in your web.config, which allows you to simply swap in different functionality. e.g. if you have an ISaleLineService, you define in the config file which concrete class should be instantiated in place of the interface.

    e.g. you would call ISaleLineService salesLineService = ServiceLocator.Resolve … which calls Container.Resolve, which finds the default mapped concrete class that implements the ISaleLineService interface as per the config file (e.g. via ConfigurationManager.GetSection()). A minimal code sketch of this mapping appears after this list.

    Some demos on the web use attribute-based and programmatic dependencies - but this loses some of the configurable swap-in/swap-out benefits of defining dependencies in your config file.

    Unit testing becomes simpler with dependency injection. There are also some cool things that can be used, like object-per-session/request - see the UnityContrib project for details on this.

    Unity doesn’t do Aspect Oriented Programming yet – but this is coming in version 1.2.
  12. ARC401 – Aspect Oriented Architecture meets SOA with WCF and Ent. Library

    Aspect-oriented technology is about separating cross-cutting concerns from the core business logic. Enabling technologies for this include the new Enterprise Library 4.0 Policy Injection Application Block, PostSharp and the Aspect# language.

    Examples of cross cutting concerns that can be applied with aspects include Validation, Exception Mgt, Caching, Logging and Authentication/Authorization.

    Some parts of this presentation were similar to the Unity presentation I mentioned above – covering policy injection, e.g. the calls to PolicyInjection.Create();

    Even with version 4.0 of the Enterprise Library, policy injection can affect performance significantly in some situations – you need to test.

    There was also a demo of the use of the Validation Application Block in version 4.0 - which supports declarative validation logic for your applications - adding the [ValidationBehaviour] attribute on a class and a RangeValidator on a parameter of a method call will allow you to raise a validation exception.

    You can set up policy injection with the Enterprise Library Configuration Tool (which is a VS 2008 add-in as well now - you don't have to go to a separate tool at all).
  13. WEB302 ADO.NET Data Services – The Zen of RESTfulness and the Art of "Astoria" - in this presentation, Scott Hanselman showed the experimental data access technologies that allow access to data stores over an intranet or the web via Urls (aka REST). More details are at http://astoria.mslivelabs.com/. ATOM (the schema-based version of RSS) underlies the structure of data services. When installed, you create an "ADO.NET Data Service" item from the VS 2008 project item types.

    You can basically query information in a datasource via Urls - e.g. /Northwind.svc/Products(1)/Category – so you can navigate through information very easily. This is useful for rich internet applications.

    With LINQ to REST, you can then query a remote database - the LINQ expression tree evaluator converts to a url rather than SQL. To me the Astoria architecture is not great – it encourages you to put your logic and other code behind the LINQ layer, e.g. so it can be consumed by a SmartClient app directly. It is, however, an awesome way of wrapping other dbs such as AS400 in a web-consumable form.

    Using the [WebGet] attribute you can even define the Astoria equivalent of stored procedures on the data.
  14. DEV325 LINQ to SQL vs LINQ to Entity Framework
    Discussed the use of the Edmx file, navigation properties with many-to-many relationships, and some of the first-hit performance issues with LINQ to Entities. Also showed some of the differences with LINQ to Entities (e.g. .AddDays won't work with this provider as it is not really going to SQL Server).
  15. WEB315 Object Oriented MS AJAX
    Ran through some ways of improving the typical procedural approaches to JavaScript taken in most application development. JavaScript has many object-oriented features, but it's not perfect - e.g. there are no real private properties in JavaScript. Instead, use the get_Property() convention.

    You can use inheritance, prototypes, interfaces, events and delegates in JavaScript. Prototypes are similar to extension methods in .NET. For object inheritance, you need to do a Type.registerNamespace("MyNs") to get your types recognised by the MS AJAX runtime. You'd typically declare interfaces like "MyNamespace.INameOfInterface = function() {}", and you must implement interfaces in the registerClass call.

    Just like in C#, delegates are just method pointers that you can pass around - such as passing a delegate into a common method where the delegate does addition/subtraction. Also demoed was raising events using this._events = new Sys.EventHandlerList() and this.raiseEvent().

    Samples from the demos can be found at www.Codeplex.com/ScottCateAjax.
  16. WEB304 – Web Futures – the next 18 months. This was a panel that dealt with some of the hot questions about what is coming up in the foreseeable development future. It involved discussion of topics such as:

    1. The battle over where your data lives, and where processing is done. Browsers as an operating system (e.g. Google Chrome). Competing viewpoints of Google (in the cloud) vs Microsoft (S+S)
    2. The battle of standard browsers vs plugins like Silverlight.
    3. The web and whether it should work offline
    4. Scott talked about JavaScript and its popularity going through the roof, especially with technologies like Bubbles, Prism and Google Gears and other supporting frameworks such as http://developer.yahoo.com/yui/
    5. The push of consumer-focused technologies into enterprises.
    6. JavaScript as the Intermediate Language (IL) of the web
    7. If starting something now, how do you hedge your bets? For any technology decision: complexity kills; do the simplest possible thing that will work and refactor it tomorrow; use the minimal amount of technology and the minimal requirements to meet a need.
  17. DEV420 Hardcore LINQ to Entities
    Wasn't really hardcore - it was more of a nuts-and-bolts look at LINQ to Entities, attribute mappings and inheritance. You can have conditional inheritance in entities (via "Where" conditions). Complex types are not supported in the designer - but you can update the xml directly to do it, e.g. a common Address entity used by other classes. The designer has "regenerate from database" functionality built in. Under the hood, there are 3 files that support LINQ to Entities - the CSDL file (conceptual), the MSL mapping file (for field-to-field mappings, with support for mappings to more friendly names), and the SSDL storage schema file.

    You cannot use normal methods such as AddDays() with LINQ to Entities as they are not supported by the provider. LINQ to Entities doesn't support lazy loading - you need to use the .Include syntax.

    You can query via generics, such as Context.Products.OfType<T>(). A suggestion was made by Adam Cogan to embed the 3 metadata files into the project - but this would cause problems when doing deployments to different environments, as you cannot just update the schema file to point to the correct database.
  18. SOA209 The Road to "Oslo" : The Microsoft Services and Modeling Platform
    This product isn't even available for demo yet (not even alpha) - but it is a new environment for keeping track of exactly what is deployed (in terms of applications, hardware etc.) and where in your organization. It is still 12-18 months away. It is both a repository of company assets and even a deployment mechanism that allows the build-up (e.g. deploying applications and opening ports) and pull-down of complete environments. It will have the "Oslo Visual Designer" component, which even allows you to modify WF workflows via the UI designer; the "Oslo Repository", which stores data against schemas about practically anything in your IT environment; and the "Oslo Process Server", which will host WF and BizTalk transformations.
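
As flagged in the ARC304 notes above, here is a minimal Unity sketch of the interface-to-concrete mapping Richard described (the service type names are illustrative, and the registration shown in code would normally live in your web.config):

using Microsoft.Practices.Unity;

IUnityContainer container = new UnityContainer();
//Map the interface to a concrete class (doing this in config rather than code
//is what allows swapping implementations without recompiling)
container.RegisterType<ISaleLineService, SaleLineService>();
//Resolve looks up the mapping and instantiates the concrete class
ISaleLineService saleLineService = container.Resolve<ISaleLineService>();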

That's it for now. Till next time. DDK

Friday, 22 August 2008

"'Edit Document' requires a Windows SharePoint Services-compatible application and Microsoft Internet Explorer 6.0 or greater." - Problem Fixed


I had this problem for about a week when trying to edit documents from SharePoint 2007. Indeed, the problem became so frustrating that I just had to fix it. The error I was getting was:

---------------------------
Windows Internet Explorer
---------------------------
'Edit Document' requires a Windows SharePoint Services-compatible application and Microsoft Internet Explorer 6.0 or greater.
---------------------------
OK
---------------------------


I came to this article:

http://support.microsoft.com/default.aspx/kb/833714 and followed the instructions:



  1. Via the Add/Remove Programs -> Office 2003 -> Change dialog, I removed and re-added the Windows SharePoint Services Support component as per Method 2 in the article above.



  2. I tried to re-register OWSSUPP.DLL (with an uninstall and reinstall with regsvr32 "C:\Program Files\Microsoft Office\OFFICE11\owssupp.dll") and just got an error:



    ---------------------------
    RegSvr32
    ---------------------------
    DllRegisterServer in OWSSUPP.DLL failed.
    Return code was: 0x80070716
    ---------------------------
    OK
    --------------------------

  3. I also tried to re-register the Office 2007 copy of the dll:
    C:\Program Files\Microsoft Office\Office12\OWSSUPP.DLL



  4. Curious, I also ran Dependency Walker (http://www.dependencywalker.com/). I dragged owssupp.dll into Dependency Walker and it indicated that I had dlls missing, as per the screenshot. I downloaded these from the web (e.g. http://www.driverskit.net/dll/link/2263.html) and moved these "missing" files to the Office 11 directory - but the problem still occurred.



  5. I finally gave up, bit the bullet and did a repair install of Office 2003 via Add/Remove Programs (which didn't require the install media and took about 10 mins) – and it all started working.

Apparently, you may get this problem if various Windows XP service packs are installed or if you uninstall any Visual Studio VSTO tools.


Wednesday, 20 August 2008

Violation of PRIMARY KEY constraint 'PK_PrimaryKeyName'. Cannot insert duplicate key in object 'dbo.TableName'

Today, one of my recently deployed apps was generating errors when attempting to insert records. The following errors started to appear in our Error logging table:

System.Data.SqlClient.SqlException. ...
Violation of PRIMARY KEY constraint 'PK_PrimaryKeyName'. Cannot insert duplicate key in object 'dbo.TableName'.

Even when attempting to insert data directly into the table via SQL Management Studio, the same error would occur. The source of the issue was that the identity seed values were out of sync with the actual values in the table (a result of doing inserts with IDENTITY_INSERT ON). The simple fix was to change to output-to-text mode in SQL Management Studio and run the following T-SQL query:



SELECT 'DBCC CHECKIDENT (' + Table_Name + ')' FROM information_schema.tables WHERE TABLE_TYPE = 'BASE TABLE'

Run the output of this query - this corrected all the 'duplicate key' issues I was having after the deployment of the database scripts.
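
The output is simply one DBCC command per user table, e.g. (table names illustrative):

DBCC CHECKIDENT (Customers)
DBCC CHECKIDENT (Orders)

Each command checks the table's current identity value against the maximum value in its identity column and corrects the seed if it is out of sync.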

Tuesday, 12 August 2008

DDK has registered for Microsoft Tech.Ed 2008 Australia... Thanks Oakton!

Oakton has recognized the hard work I've been doing at my current client - and has decided that I and coworker Steven Krizanovic will be heading across to the other side of Darling Harbour to the geek extravaganza that is Tech.Ed 2008. The US edition of Tech.Ed has been divided into Developer and IT Professional streams, but Australia keeps the one-stream format. The last time I went to Tech.Ed was in 1999 @ Dreamworld on the Gold Coast, so it's been a long time between drinks!

I'll be concentrating on the following tracks:
  1. SOA and Business Processes (which focuses on BizTalk, WCF and WF)

  2. Web (particularly Silverlight + WPF)

  3. Developer Tools & Technologies

  4. Architecture

  5. DB and BI tracks

I'll keep you posted with the most valuable tidbits as the event unfolds...

SQL Reporting Services Error: Logon failed. (rsLogonFailed) Logon failure: unknown user name or bad password. (Exception from HRESULT: 0x8007052E)



I've run into this SQL Reporting Services 2005 exception a couple of times now - typically when the Reporting Services execution account runs as a domain user whose password expires. You will get this error if the credentials supplied for the report (or, if none are supplied, the current SQL Reporting Services execution account) are incorrect. This happened on my local machine, and the simple fix was to update the password on my execution account.



Corrected the account username/pass... fixed!

Saturday, 19 July 2008

Cannot uninstall a .NET Windows Service with installutil /u when the service executable has been moved or deleted - Fix

There are a few perils unique to developing Windows Services in .NET. This is one of them.

The other day, I renamed some of my Subversion working folders. Unfortunately, one of the folders that I renamed actually contained a service that I had registered via installutil.exe on my local machine.

There is a problem with installutil.exe which means that this could be an unrecoverable Catch-22 situation. Here's why:


  1. You cannot uninstall it. If you try to uninstall it with installutil /u and point to your service (e.g. "installutil /u DDK.ProjectName.MyNotificationServiceName"), it cannot find the file and will give an "Exception occurred while initializing the installation: System.IO.FileNotFoundException: Could not load file or assembly '[Full Path To My File]' or one of its dependencies. The system cannot find the file specified."

  2. You cannot install the exe to a different location with installutil because the service is already installed. If you do try to install it with a new path, you will just get the error "An exception occurred during the Install phase. System.ComponentModel.Win32Exception: The specified service already exists".

So to install a service with the same name at the new location, I would have to:



  1. Copy the old file back to the original Windows Service location (or restore it from a backup) and run installutil /u on it. If I don't have the file anymore, I am not able to do this.

  2. OR Remove the registry entry for the service.

This would not be as much of an issue if installutil /u recognized the missing service and prompted me to remove the registry entry - but it doesn't. I understand you want to do cleanup of a service when an uninstall is called - but you shouldn't be left in an unrecoverable state because of a lack of functionality in the core installer utility.

So when you don't have access to the file/drive you originally installed a service to, you can fix this unrecoverable (from the perspective of installutil) situation by either:


  1. Opening regedit

  2. Going to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services

  3. Removing the old registry entry for your service.

OR


Running "sc" from a command prompt - see screenshot below for parameters:



Adding simple Gzip compression for a 40-60% reduction in page size on your ASP.NET 2.0 Site

UPDATE (29 October 2008): If you can, try to use the following:
http://www.codeproject.com/KB/aspnet/httpcompression.aspx
This has some major benefits, such as compressing your axd files, combining your css and javascript, and minifying the output.

There are a few different ways to get Gzip compression happening on your site. These include:

  1. Custom Http modules that implement IHttpModule such as http://blowery.org/httpcompress/

  2. 3rd party handlers such as http://www.port80software.com/products/httpzip/

  3. If you have full access to the IIS Box and metabase, use the built-in Gzip compression available in IIS 6.0 and above (See http://weblogs.asp.net/owscott/archive/2004/01/12/57916.aspx for more information)

  4. Modifying the global.asax to implement compression.
I briefly outline option 4 below. With ASP.NET, it is incredibly easy to get it up and running without any additonal server set up. Note that it is important that you don't gzip your axd files through this code. Some UI components such as Telerik RadControls will generate several javascript errors if you try to gzip its axd resource files. I also found that a page I created for dynamically rendering images started chop images off at the bottom. So I excluded them from any attempts at compression.

If you look at your page size in YSlow, it will typically be reduced by 40-60%, e.g. from 200K to 100K.

Here's some sample code that you can put into your global.asax with minor modifications appropriate to your project:




//Gzip support - requires the System.IO and System.IO.Compression namespaces
void Application_BeginRequest(object sender, EventArgs e)
{
    HttpApplication app = (HttpApplication)sender;

    //Don't process axd resources or dynamically rendered images as compression corrupts them
    if (app.Request.Url.ToString().Contains("ImageGenerator.aspx") ||
        app.Request.Url.ToString().Contains("WebResource.axd") ||
        app.Request.Url.ToString().Contains("ScriptResource.axd"))
        return;

    string acceptEncoding = app.Request.Headers["Accept-Encoding"];
    Stream prevUncompressedStream = app.Response.Filter;

    if (acceptEncoding == null || acceptEncoding.Length == 0)
        return;

    acceptEncoding = acceptEncoding.ToLower();

    if (acceptEncoding.Contains("gzip"))
    {
        // gzip
        app.Response.Filter = new GZipStream(prevUncompressedStream,
            CompressionMode.Compress);
        app.Response.AppendHeader("Content-Encoding", "gzip");
    }
    else if (acceptEncoding.Contains("deflate"))
    {
        // deflate
        app.Response.Filter = new DeflateStream(prevUncompressedStream,
            CompressionMode.Compress);
        app.Response.AppendHeader("Content-Encoding", "deflate");
    }
}


For more information on IIS and the built-in settings when you have full access to IIS, see the following articles for reference:

http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/d52ff289-94d3-4085-bc4e-24eb4f312e0e.mspx?mfr=true
http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/502ef631-3695-4616-b268-cbe7cf1351ce.mspx?mfr=true


The IIS compression dialog:





Removing all blank lines from a file with regular expressions in Visual Studio or UltraEdit

I often deal with files which have redundant empty lines in them. These are easily removed with either Visual Studio or one of the best text editors around, UltraEdit by IDM Solutions. The regular expressions that match a blank line are slightly different in these 2 applications (the end-of-line character "$" appears in a different position in each):

Visual Studio:

Press Ctrl+H to open Find and Replace
Select "Use Regular Expressions"
In Find specify "^$\n" (without the quotes).
Set the replace value to blank.
Click on "Replace All"

UltraEdit:

[from the UltraEdit FAQ at http://www.ultraedit.com/support/faq.html]

To delete/strip blank lines from DOS/Unix/Mac formatted-files, use the following Perl-compatible regular expression. You can enable Perl-compatible regular expressions under Advanced -> Configuration -> Search -> Regular Expression Engine.

Replace: "^\r?\n?$" (without the quotes)
With "" (without the quotes - i.e. nothing).

Earlier versions of UltraEdit:

To delete blank lines with DOS line terminators you can use an UltraEdit-style regular expression replace as follows:

Replace: "^p$" (without the quotes)
With "" (without the quotes - i.e. nothing).

Run this replace until every blank line is deleted.
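If you need to do the same cleanup programmatically, here is a minimal C# sketch using the .NET Regex class, with the same idea as the Perl-compatible pattern above (the file path is a placeholder):

    using System.IO;
    using System.Text.RegularExpressions;

    class BlankLineStripper
    {
        static void Main()
        {
            string path = @"C:\temp\input.txt"; //Placeholder path
            string text = File.ReadAllText(path);
            //In Multiline mode, ^ matches at the start of every line, so
            //"^\r?\n" matches an empty line together with its line terminator
            string cleaned = Regex.Replace(text, @"^\r?\n", string.Empty, RegexOptions.Multiline);
            File.WriteAllText(path, cleaned);
        }
    }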

Amusing Image of the day from FailBlog.org...Why you should think before you post a comment on a blog :o)


You get the error "[Script Name].ps1 cannot be loaded because the execution of scripts is disabled on this system" when running Powershell scripts

If you get the error:
File D:\Sc\Global\DDK.Solution\dev\DDK.ProjectName\deploy.ps1 cannot be loaded because the execution of scripts is disabled on this system. Please see "get-help about_signing" for more details.

You get this error because the default execution policy for PowerShell is "Restricted" (aka locked-down Alcatraz mode). In this mode, it does not load configuration files or run scripts.

To resolve this issue, you can run PowerShell (powershell.exe typically lives in C:\WINNT\system32\WindowsPowerShell\v1.0\ if it is not already in your path) and change the execution policy. For example, you could run the command "Set-ExecutionPolicy Unrestricted" if you want to allow unsigned scripts to run.
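A quick sketch of checking and then changing the policy - note that "RemoteSigned" is a less drastic alternative to "Unrestricted" that allows local unsigned scripts but still requires downloaded scripts to be signed:

    # See what the current policy is
    Get-ExecutionPolicy

    # Allow local unsigned scripts; downloaded scripts must still be signed
    Set-ExecutionPolicy RemoteSigned

    # Or allow everything (use with care)
    Set-ExecutionPolicy Unrestricted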

Once you have set your execution policy appropriately, you can run your PowerShell scripts without this error. See http://technet.microsoft.com/en-us/library/bb978644(TechNet.10).aspx for more information.




Deleting Folders in MOSS via Web Services and CAML

Unfortunately, the lists.asmx web service that you use to manipulate MOSS lists doesn't have a "Delete()" method for folders. However, there is an "UpdateListItems()" method that accepts an XML parameter called "batchElement" to provide this functionality. Through this parameter, you can manipulate folders in SharePoint to your heart's content.

The typical format for the batch element Xml fragment is:

<Batch OnError='Return'>
<Method ID='1' Cmd='Delete'>
<Field Name='ID'>81</Field>
<Field Name='FileRef'>http://dev-moss/sites/home/PropertySharePoint/DocumentLibrary/300</Field>
</Method>
</Batch>


Your delete is successful if the ErrorCode in the response is zero (0x00000000). You can test this out in the U2U CAML Query Builder from http://www.u2u.info/Blogs/Patrick/Lists/Posts/Post.aspx?ID=1315



This batchElement information can be passed into the SharePoint list web service as demonstrated in the method snippet below:



/// <summary>
/// Delete folders as per http://msdn2.microsoft.com/en-us/library/ms429658.aspx for LPP-205
/// </summary>
/// <param name="listName">Name of the SharePoint list containing the folder</param>
/// <param name="folderName">Title of the folder to delete</param>
/// <returns>The response from UpdateListItems, or null if the folder was not found</returns>
public XmlNode DeleteFolder(string listName, string folderName)
{
    /*Use the CreateElement method of the document object to create elements for the parameters that use XML.*/
    System.Xml.XmlDocument xmlDoc = new System.Xml.XmlDocument();
    XmlElement query = xmlDoc.CreateElement("Query");
    XmlElement viewFields = xmlDoc.CreateElement("ViewFields");
    XmlElement queryOptions = xmlDoc.CreateElement("QueryOptions");
    string rowLimit = int.MaxValue.ToString();

    /*To specify values for the parameter elements (optional), assign CAML fragments to the InnerXml property of each element.*/
    System.Text.StringBuilder sb = new System.Text.StringBuilder();
    sb.Append("<Where><Eq><FieldRef Name=\"Title\" />");
    sb.Append(string.Format("<Value Type=\"Text\">{0}</Value></Eq></Where>", folderName));
    viewFields.InnerXml = "<FieldRef Name=\"ID\" /><FieldRef Name=\"Title\" />";
    query.InnerXml = sb.ToString();
    queryOptions.InnerXml = "";

    System.Xml.XmlNode nodeListItems = _listWebService.GetListItems(listName, string.Empty, query, viewFields, rowLimit, queryOptions, null);

    string folderId = string.Empty;
    string fileRef = string.Empty;
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(nodeListItems.InnerXml);
    XmlNamespaceManager nsmgr = new XmlNamespaceManager(doc.NameTable);
    nsmgr.AddNamespace("z", "#RowsetSchema");
    nsmgr.AddNamespace("rs", "urn:schemas-microsoft-com:rowset");
    XmlNodeList xmlNodeList = doc.SelectNodes("/rs:data/z:row", nsmgr);
    foreach (XmlNode node in xmlNodeList)
    {
        folderId = node.Attributes["ows_ID"].Value;
        //fileRef = node.Attributes["ows_EncodedAbsUrl"].Value;
        //ows_FileRef is in the form "ID;#server-relative-url" - strip the "ID;#" prefix
        fileRef = node.Attributes["ows_FileRef"].Value.Substring(node.Attributes["ows_FileRef"].Value.IndexOf("#") + 1);
        break;
    }

    System.Xml.XmlNode result = null; //Will be populated with the response from the update batch.
    if (folderId != string.Empty)
    {
        System.IO.StringWriter sw = new System.IO.StringWriter();
        System.Xml.XmlTextWriter xw = new System.Xml.XmlTextWriter(sw);
        xw.WriteStartDocument();
        // Build Batch node
        xw.WriteStartElement("Batch");
        xw.WriteAttributeString("OnError", "Return");
        // Build Method node
        xw.WriteStartElement("Method");
        // Set transaction ID - doesn't really matter what the value is
        xw.WriteAttributeString("ID", System.Guid.NewGuid().ToString("n"));
        xw.WriteAttributeString("Cmd", "Delete");
        // Build ID field
        xw.WriteStartElement("Field");
        xw.WriteAttributeString("Name", "ID");
        xw.WriteString(folderId);
        xw.WriteEndElement(); // Field end
        // Build FileRef field
        xw.WriteStartElement("Field");
        xw.WriteAttributeString("Name", "FileRef");
        xw.WriteString(fileRef);
        xw.WriteEndElement(); // Field end
        xw.WriteEndElement(); // Method end
        xw.WriteEndElement(); // Batch end
        xw.WriteEndDocument();

        System.Xml.XmlDocument batchElement = new System.Xml.XmlDocument();
        batchElement.LoadXml(sw.GetStringBuilder().ToString());

        // Send the delete batch to the SharePoint list web service
        result = _listWebService.UpdateListItems(listName, batchElement);
    }
    return result;
}
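A hypothetical call, assuming _listWebService is a proxy pointing at your site's /_vti_bin/lists.asmx (the list and folder names below are placeholders):

    XmlNode response = DeleteFolder("DocumentLibrary", "300");
    if (response != null)
    {
        //A successful delete contains <ErrorCode>0x00000000</ErrorCode>
        Console.WriteLine(response.OuterXml);
    }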

Tuesday, 8 July 2008

TortoiseSVN 1.5 and svnmerge Issue - "svn: This client is too old to work with working copy '.'; please get a newer Subversion client"


WARNING: TortoiseSVN 1.5 does silent upgrades (aka the touch of death) on your SVN working copies that render them unusable with older SVN clients. This affects clients such as the 1.4-based version of svnmerge (the current version as I write this).

Before the 1.5 upgrade, the last major version of TortoiseSVN was released over 2 years ago. So you can imagine that I was quite keen to upgrade when TortoiseSVN notified me that version 1.5 was available.

I've generally had a good experience with TortoiseSVN thus far, so I bit the bullet and downloaded the latest and greatest version of this handy tool. I only uncovered the implications of this upgrade when all my svnmerge scripts started to fail (we use the svnmerge utility via a batch file to deploy our changes to trunk and onto our build server). Then things hit the proverbial fan. I started to get the following error for all svnmerge operations (including simple status calls):

"svn: This client is too old to work with working copy '.'; please get a newer Subversion client"

I thought it was unusual that:

  1. TortoiseSVN installer would affect svnmerge at all (I assumed some shared DLLs had been updated).
  2. (on a more minor note) That it would start telling me that I had an old version of the client tool when I had the very latest versions of TortoiseSVN and svnmerge (http://www.subversionary.org/binaries/installer-for-svnmerge). I would hope that a tool as popular as svnmerge would be updated within days of a new client version being released.

I thought the rollback process would be as simple as uninstalling TortoiseSVN 1.5 - but I uninstalled and changed back to 1.4.8 - and got the same error! Even the 1.4 versions of svn started to get the same error.

It turns out that once you start working with the new version of the tool, the SVN metadata is silently upgraded to the latest version. Unfortunately, if you want to keep using svnmerge, the fix for this issue (until a new version of svnmerge comes out) is to:

  1. Roll back to a 1.4 version
  2. Essentially throw away your working copy
  3. Do a fresh checkout.

This will get you back to where you started. It could take a while - especially if you're in Australia and the Subversion server is in Atlanta! Phew!

The TortoiseSVN 1.5 touch of death at work (it is not backwards compatible with 1.4 clients):