
The Musings and Findings of Software Consultant David Klein (Sydney, Australia)
Friday, 20 June 2008
Performance Profiling in Firefox 3 and the YSlow Addin for FireBug

Tuesday, 17 June 2008
Using the Power of PowerShell to Configure your Web.Config for Different Environments, Customize the Output of SQLMetal and Bulk Update Word Documents
With this in mind, PowerShell is extremely handy for ANYTHING XML-based, or for any task previously done with batch files:
- Modifying your web.config as part of your build process during deployment. In a previous post, Scott Guthrie recommended using multiple config files, one for each environment (see http://weblogs.asp.net/scottgu/archive/2007/09/21/tip-trick-automating-dev-qa-staging-and-production-web-config-settings-with-vs-2005.aspx). The problem with this is the complete duplication of settings files, which makes it more likely you will forget to update a setting in one of the environments. Using one master config file that PowerShell configures for each environment is much better. PowerShell can also do all your zipping and FTPing to staging servers.
- Deployment and customization of Reporting Services reports. RDL is a standard XML language that can be customized if you don't like the output of the Visual Studio RDL editor/designer.
- Updating Word and Excel 2007 documents automatically. The docx format is now all XML, so PowerShell can perform bulk updates to Word documents for you without the need for Word automation or a separate executable.
- Updating LINQ DBML files to your liking. I will detail an example of this below.
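As a rough illustration of the docx-is-just-XML point above (sketched in Python rather than PowerShell for brevity; `bulk_replace_docx` is a hypothetical helper, and a plain string replace will miss text that Word has split across formatting runs):

```python
import zipfile

def bulk_replace_docx(src_path, dst_path, old, new):
    # A .docx is just a zip of XML parts, so no Word automation is needed:
    # copy every part across, rewriting the main document XML on the way.
    with zipfile.ZipFile(src_path) as src, \
         zipfile.ZipFile(dst_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            data = src.read(item.filename)
            if item.filename == "word/document.xml":
                data = data.decode("utf-8").replace(old, new).encode("utf-8")
            dst.writestr(item, data)
```

Point this at a folder of documents and you have a bulk update with no Word automation in sight.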
The batch file and PowerShell script below fix up what the SQLMetal dbml generator doesn't do:
- removes tables that are not needed in the DBML
- renames associations so they are more meaningful (e.g. if you have multiple relationships between 2 tables, they will get names such as Lookup1, Lookup2, so your LINQ code using these relationships won't make much sense.)
The Batch File:
sqlmetal /conn:"data source=myServerName\dev;initial catalog=MyDataBaseName;trusted_connection=false;Persist Security Info=True;User ID=UserName;Password=Password;" /dbml:PropertyPipelineDb.dbml /context:InvestmentManagementDataContext /namespace:LendLease.InvestmentManagement.Domain.Entity /pluralize /entitybase:"LendLease.Framework.Domain.Entity.MutableEntity" /views
powershell -command "$temp = .\CleanDbml.ps1 -dbml '.\PropertyPipelineDb.dbml'; $temp | Out-File '.\PropertyPipelineDb.dbml' -Encoding Unicode"
sqlmetal /code:PropertyPipelineDb.designer.cs PropertyPipelineDb.dbml
pause
The PowerShell Script (CleanDBML.ps1)
Param
(
$DBML # File Path to DBML file
)
$removeTables = ("dbo.vRolePermission", `
"dbo.vOpportunityFlattenedFundInterests", `
"dbo.vOpportunityDetailsWithFlatFundInterests", `
"dbo.vAssetAddress", `
"dbo.vContactOpportunities", `
"dbo.vContactOpportunitiesWithFlatFundInterests", `
"dbo.vLookupHierarchy", `
"dbo.CachedUser", `
"dbo.CachedUserExtension", `
"dbo.OpportunityImport" `
);
$renameTables = @{ "dbo.vOpportunitySearch" = @{ Type = "vOpportunitySearch" ; Member = "vOpportunitySearches"}; `
"Code.NotificationType" = @{ Type = "NotificationType" ; Member = "NotificationTypes"}; `
"dbo.vPropertyPipelineFund" = @{ Type = "Fund" ; Member = "Funds"} `
};
$renameAssosciations = @{ "FK_OpportunityRelationship_Opportunity1" = @{ FKMember = "Opportunity1"; PluralMember = "OpportunityRelationship1s"}; `
"FK_OpportunityRelationship_Opportunity2" = @{ FKMember = "Opportunity2"; PluralMember = "OpportunityRelationship2s"}; `
"FK_CompanyCompanyType_Company" = @{ FKMember = "Company"; PluralMember = "CompanyTypes"}; `
"FK_NotificationType_NotificationType" = @{ FKMember = "ParentNotificationType"; PluralMember = "ChildNotificationTypes"} `
};
$exists = Test-Path ($DBML);
if (! $exists)
{
$DBML = Join-Path -path $pwd -childpath $DBML;
}
$exists = Test-Path ($DBML);
if( $exists )
{
$dbmlFileInfo = Get-Item -path $DBML;
[System.Xml.XmlReader]$local:reader = [System.Xml.XmlReader]::Create($dbmlFileInfo);
[System.Xml.XmlDocument]$local:doc = new-object System.Xml.XmlDocument;
$doc.set_PreserveWhitespace($true);
$doc.Load($reader);
$reader.Close();
[System.Xml.XmlElement]$local:root = $doc.get_DocumentElement();
# Remove nodes that don't belong
$removedTypeMap = @{};
$tableNodesRemoved = $doc.Database.Table |
ForEach-Object { $_; } |
Where-Object { $removeTables -contains $_.Name } |
ForEach-Object { $removedTypeMap.Add($_.Type.Name, $_.Name); $_ = $doc.Database.RemoveChild($_); $_; };
# Remove any associations on other tables which reference a removed type
$assosciationNodesWithTypesRemoved = $doc.Database.Table |
Where-Object { $_.Type.Association -ne $null } |
ForEach-Object { $_.Type.Association; } |
Where-Object { $removedTypeMap[$_.Type] -ne $null } |
ForEach-Object { $_ = $_.get_ParentNode().RemoveChild($_); $_; };
# Rename nodes for tables with their new aliases
$tableNodesToRename = $doc.Database.Table |
ForEach-Object { $_; } |
Where-Object { $renameTables[$_.Name] -ne $null };
$renamedTypeMap = @{};
$tableNodesToRename | ForEach-Object { $newName = $renameTables[$_.Name].Type; $renamedTypeMap.Add($_.Type.Name, $newName); $_.Type.Name = $newName; };
$renamedMemberMap = @{};
$tableNodesToRename | ForEach-Object { $newName = $renameTables[$_.Name].Member; $renamedMemberMap.Add($_.Member, $newName); $_.Member = $newName; };
# Fix up any associations on other tables which reference a renamed type
$assosciationNodesWithTypesRenamed = $doc.Database.Table |
Where-Object { $_.Type.Association -ne $null } |
ForEach-Object { $_.Type.Association; } |
Where-Object { $renamedTypeMap[$_.Type] -ne $null } |
ForEach-Object { $_.Type = $renamedTypeMap[$_.Type]; $_; };
# Fix up member names for any associations on other tables which are foreign key associations and reference a renamed type
$assosciationNodesWithFKMembersRenamed = $assosciationNodesWithTypesRenamed |
Where-Object { $renamedTypeMap[$_.Member] -ne $null -and $_.IsForeignKey -eq "true" } |
ForEach-Object { $_.Member = $renamedTypeMap[$_.Member]; $_; };
# Fix up member names for any associations on other tables which are subset associations and reference a renamed member
$assosciationNodesWithSubsetMembersRenamed = $assosciationNodesWithTypesRenamed |
Where-Object { $renamedMemberMap[$_.Member] -ne $null -and $_.IsForeignKey -eq $null } |
ForEach-Object { $_.Member = $renamedMemberMap[$_.Member]; $_; };
# Apply the explicit association renames from the $renameAssosciations map
$assosciationFKNodesRenamed = $doc.Database.Table |
Where-Object { $_.Type.Association -ne $null } |
ForEach-Object { $_.Type.Association; } |
Where-Object { $renameAssosciations[$_.Name] -ne $null -and $_.IsForeignKey -eq "true" } |
ForEach-Object { $_.Member = $renameAssosciations[$_.Name].FKMember; $_; };
$assosciationSubsetNodesRenamed = $doc.Database.Table |
Where-Object { $_.Type.Association -ne $null } |
ForEach-Object { $_.Type.Association; } |
Where-Object { $renameAssosciations[$_.Name] -ne $null -and $_.IsForeignKey -eq $null } |
ForEach-Object { $_.Member = $renameAssosciations[$_.Name].PluralMember; $_; };
[System.Xml.XmlWriterSettings]$writerSettings = new-object System.Xml.XmlWriterSettings;
$writerSettings.set_Indent($true);
$writerSettings.set_NewLineHandling([System.Xml.NewLineHandling]::Replace);
$writerSettings.set_CloseOutput($true);
$local:outputStream = new-object System.IO.StringWriter;
[System.Xml.XmlWriter]$local:writer = [System.Xml.XmlWriter]::Create($outputStream, $writerSettings);
$doc.Save($writer);
$writer.Flush();
$writer.Close();
$outputStream.Close();
$outputStream.ToString();
}
To assist your development of PowerShell scripts, you can use PowerShell Plus (http://www.powershell.com/) to parse and debug your scripts. Another amazing utility is the PowerShell add-in for Reflector: http://www.hanselman.com/blog/ReflectorAddinsAndPowerShellLanguageSupportForReflector.aspx. It actually attempts to disassemble existing .NET DLLs into PowerShell scripts.
There are a whole host of plugins for Lutz Roeder's application mentioned in the Hanselman article above (the Reflector compare of different DLLs looks very useful) - but I'll have to save my comments for a later blog post...
DDK
Using Impersonation Batch Files and RunAs to Assist Web Application Testing

For example, you can use the following "PP_Test_ImpersonateSuperUser.bat" file to impersonate a super user without logging off and back on as that user. It will also open IE at the http://pp-test web site (an intranet site).
@ECHO ******Running as SUPER USER (IMTEST11); password is SuperUsersPassWord
@ECHO OFF
COLOR 0A
runas /user:APAC\imtest11 /savecred "%ProgramFiles%\Internet Explorer\iexplore.exe \"pp-test\""
pause
You can now test to your heart's content without having to log off your normal Windows account or log into Terminal Services on another machine as a different user.
Monday, 19 May 2008
Simple Auditing with LINQ to SQL - Date, Time and User Stamps
There are many ways you can populate application audit tables or audit fields. These include:
- With triggers (but this relies on the SQL authentication mechanism for accurate user data);
- With stored procedures (this also relies on the accuracy of SQL authentication information, or you need to append the user name to the parameters of the procedure);
- On the Application Side - Manually coding all service/database calls to append this information.
- Use a SQL Server log auditing tool like LogExplorer that tracks the SQL Server transaction log.
- Leveraging your data access layer's update point to append this information before the update takes place.
Below you can find my code, which takes the last approach and uses reflection to stamp records with when they were created/updated and who did the insert/update. This is a simplified alternative to a full audit solution (such as http://blog.matthidinger.com/2008/05/09/LINQToSQLAuditTrail.aspx, which keeps a dedicated audit table) for when the requirement is just to display who last modified or created a particular record in your application:
/// <summary>
/// Basic Audit User and Date Stamp Functionality
/// </summary>
/// <param name="failureMode"></param>
public override void SubmitChanges(ConflictMode failureMode)
{
//Updates
foreach (object modifiedEntity in this.GetChangeSet().Updates)
{
SetAuditStamp(this, modifiedEntity, ChangeType.Update);
}
//Inserts
foreach (object modifiedEntity in this.GetChangeSet().Inserts)
{
SetAuditStamp(this, modifiedEntity, ChangeType.Insert);
}
base.SubmitChanges(failureMode);
}
/// <summary>
/// For Inserts or Updates - set the user and date stamps
/// </summary>
/// <param name="context"></param>
/// <param name="modifiedEntity"></param>
/// <param name="changeType"></param>
private void SetAuditStamp(DataContext context, object modifiedEntity, ChangeType changeType)
{
string userName = System.Threading.Thread.CurrentPrincipal.Identity.Name;
const string Created = "Created", CreatedBy = "CreatedBy",
Modified = "Modified", ModifiedBy = "ModifiedBy";
if (changeType == ChangeType.Insert)
{
SetAuditValue(modifiedEntity, Created, System.DateTime.Now);
SetAuditValue(modifiedEntity, CreatedBy, userName);
}
else if (changeType == ChangeType.Update)
{
SetAuditValue(modifiedEntity, Modified, System.DateTime.Now);
SetAuditValue(modifiedEntity, ModifiedBy, userName);
}
}
/// <summary>
/// The type of modifications
/// </summary>
private enum ChangeType
{
Update,
Insert
}
/// <summary>
/// Set target value if it exists on the object
/// </summary>
/// <param name="modifiedEntity"></param>
/// <param name="fieldName"></param>
/// <param name="propertyValue"></param>
private void SetAuditValue(object modifiedEntity, string fieldName, object propertyValue)
{
if (modifiedEntity.GetType().GetProperty(fieldName) != null) //Set current user and time stamp
{
modifiedEntity.GetType().GetProperty(fieldName).SetValue(modifiedEntity, propertyValue, null);
}
}
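The key guard in the code above - only stamp a field if the entity actually has it - translates to any reflective language. Here is an illustrative Python analogue (not part of the original post; the `Opportunity` class is a made-up example entity):

```python
from datetime import datetime

def set_audit_value(entity, field_name, value):
    # Mirror of the C# GetProperty(...) != null check: only set the
    # audit field if the entity actually defines it.
    if hasattr(entity, field_name):
        setattr(entity, field_name, value)

class Opportunity:
    Modified = None
    ModifiedBy = None

entity = Opportunity()
set_audit_value(entity, "Modified", datetime.now())
set_audit_value(entity, "ModifiedBy", "APAC\\imtest11")
set_audit_value(entity, "NoSuchField", 123)  # silently ignored
```

The benefit of the "set it if it exists" approach is that entities without audit columns pass through the same code path untouched.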
Friday, 16 May 2008
Microsoft SQL Server 2005 Express Edition with Advanced Services (aka with Reports) - Now Available
The spiel for this new download is:
"Free to download, free to deploy, and free to redistribute as an embedded part of an application, SQL Server Express with Advanced Services is the fast and easy way to develop and manage data-driven applications with powerful built-in reporting and full-text search functionality."
You can now find the answer to your redistributable reporting and full-text indexing issues right here: http://msdn.microsoft.com/en-us/express/aa718379.aspx
Thursday, 15 May 2008
Automatically Generating Audit Columns for your Database - no monkey work!
Open a query window on your database in SQL Server 2005 Management Studio, change the output mode to "Text", and then run the query below by pressing F5. The output window will contain the script to run for all tables in your database - table auditing heaven!
SELECT 'ALTER TABLE ' + TABLE_NAME + ' ADD Modified datetime NOT NULL DEFAULT GETDATE(),
ModifiedBy nvarchar(255) NOT NULL DEFAULT USER_NAME(),
Created datetime NOT NULL DEFAULT GETDATE(),
CreatedBy nvarchar(255) NOT NULL DEFAULT USER_NAME() ' FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
AND TABLE_NAME <> 'sysdiagrams'
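The generate-then-run trick above can be mimicked in any language; as a sketch, here is a hypothetical Python equivalent that produces one ALTER TABLE statement per table name, skipping sysdiagrams just like the WHERE clause:

```python
AUDIT_COLUMNS = (
    " ADD Modified datetime NOT NULL DEFAULT GETDATE(),\n"
    "ModifiedBy nvarchar(255) NOT NULL DEFAULT USER_NAME(),\n"
    "Created datetime NOT NULL DEFAULT GETDATE(),\n"
    "CreatedBy nvarchar(255) NOT NULL DEFAULT USER_NAME()"
)

def audit_script(table_names):
    # One ALTER TABLE per base table, excluding sysdiagrams
    return "\n".join(
        "ALTER TABLE " + name + AUDIT_COLUMNS
        for name in table_names
        if name != "sysdiagrams"
    )
```

Feed it the list of base tables and run the resulting script against the database.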
Friday, 9 May 2008
Resetting an Identity Field to 0 when testing a data migration process
Never fear - it IS possible to reset the identity seed value by using the "DBCC CHECKIDENT" command. For example, to reset a table so that the next identity value is 1, you can run the following as part of your data migration script:
DBCC CHECKIDENT('TableName', RESEED, 0)
This will force the next value generated to be 1. Also note that DBCC CHECKIDENT even works with fully qualified names for the table name parameter. For example:
ALTER PROCEDURE [dbo].[proc_InitializePrimaryTables]
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- Insert statements for procedure here
TRUNCATE TABLE InvestmentManagementDataMigration..OpportunityStatusHistory
DBCC CHECKIDENT('InvestmentManagementDataMigration..OpportunityStatusHistory', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..OpportunityUserRole
DBCC CHECKIDENT('InvestmentManagementDataMigration..OpportunityUserRole', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..FundInterest
DBCC CHECKIDENT('InvestmentManagementDataMigration..FundInterest', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..OpportunitySourceContact
DBCC CHECKIDENT('InvestmentManagementDataMigration..OpportunitySourceContact', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..AssetOwnership
DBCC CHECKIDENT('InvestmentManagementDataMigration..AssetOwnership', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..AssetRelationship
DBCC CHECKIDENT('InvestmentManagementDataMigration..AssetRelationship', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Opportunity
DBCC CHECKIDENT('InvestmentManagementDataMigration..Opportunity', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Contact
DBCC CHECKIDENT('InvestmentManagementDataMigration..Contact', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Company
DBCC CHECKIDENT('InvestmentManagementDataMigration..Company', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Address
DBCC CHECKIDENT('InvestmentManagementDataMigration..Address', RESEED, 0)
END
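The arithmetic behind all of those reseeds is simply that the next identity generated is the reseed value plus the table's increment (1 for these tables), which is why reseeding to 0 yields 1. Stated as a trivial sketch (`next_identity` is just a hypothetical helper to make the rule explicit):

```python
def next_identity(reseed_value, increment=1):
    # After DBCC CHECKIDENT('Table', RESEED, n) on a table that has
    # previously held rows, the next identity inserted is n + increment.
    return reseed_value + increment
```

So RESEED, 0 gives a next value of 1, and reseeding to 99 would give 100.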
Trivia:
Just to be tricky, I tried setting the seed to a negative value (-1) just to see what this would do...and the next value generated was 10. I don't know about you, but last time I checked, 10 does not follow -1!
SSIS Excel Data Source Error - “Text was truncated or one or more characters had no match in the target code page.”
I also hit this error at my previous company http://www.ssw.com.au/ when I was doing migration from Excel spreadsheets via SQL Server Integration Services (SSIS) for my client Queensland Water Infrastructure.
As described by my http://www.oakton.com.au/ colleague Craig at http://waxtadpole.wordpress.com/2008/04/28/hello-world/, this issue also came up at our current client "LL". The problem is that the sample the JET driver takes to determine the input data type of an Excel column is simply not large enough. For example, if every sampled row of "Column A" holds a short value but row 2000 contains more than 255 characters, the long value will not be picked up: the data type will be set to Unicode string [DT_WSTR] 255 and the load will then fail with a truncation error. Even if you override the length in the Advanced tab, SSIS will keep setting the size back to 255. The fix is to either:
- Put a large placeholder on the first line of the spreadsheet to force the detection of a larger size field (so it gets promoted to a "Unicode text stream [DT_NTEXT]")
- Increase the sample size so that the correct field size is picked up (recommended). You can increase the number of rows sampled by raising the TypeGuessRows value under the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel registry key. By default, this is the ludicrously small value of 8(!) rows. For a more reasonable 100 rows, you can copy the following into a .reg file and double-click it to update your registry:
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel]
"TypeGuessRows"=dword:00000064
After you do this, it will suddenly realize that the column settings are too small and update the data type to "Unicode text stream [DT_NTEXT]".

Also note that you should close your spreadsheet before opening your SSIS package - otherwise it will wait indefinitely with "Visual Studio is Busy" messages!