Friday, 20 June 2008

Performance Profiling in Firefox 3 with the YSlow Add-in for Firebug

YSlow (sometimes mistakenly written as "WhySlow") is a very handy utility from the Yahoo Developer team (http://developer.yahoo.com/yslow/) which allows you to profile the performance of your pages in Firefox. It requires Firebug to be installed first, and runs inside your Firebug panel.


One simple and very handy feature that Fiddler 2 (http://www.fiddlertool.com/fiddler/), the web debugging proxy commonly used with Internet Explorer, doesn't currently have is a page size and page timing display in the bottom right-hand corner. I recall that Netscape used to have this about 15 years ago, and Safari and Opera currently have this feature. The only reason I can see why IE and Firefox don't have this by default is that the masses would get unhappy when they realized just how slowly their pages are currently loading!


From now on, this will be my first port of call when making speed and size comparisons between sites.


Tuesday, 17 June 2008

Using the power of PowerShell to Configure your Web.Config for Different Environments, Customize the Output of SQLMetal and Bulk Update Word Documents

Microsoft PowerShell (http://www.microsoft.com/windowsserver2003/technologies/management/powershell/default.mspx) is an incredibly powerful tool because it has a particularly elegant syntax for manipulating XML files - with no XML DOM code required. It has many other features, such as support for using .NET code inline (e.g. C#) - however, I will just concentrate on the XML handling capabilities today.

With this in mind, it is extremely handy as a tool for ANYTHING XML-related, or anything previously done in batch files:

  1. Modifying your web.config as part of your build process during deployment. Scott Guthrie has previously recommended using multiple config files, one for each environment (see http://weblogs.asp.net/scottgu/archive/2007/09/21/tip-trick-automating-dev-qa-staging-and-production-web-config-settings-with-vs-2005.aspx). The problem with this is that you have complete duplication of settings files, which makes it more likely you will forget to update a setting for one of the environments. Using one master config file that PowerShell configures for each environment is much better (see the first sketch after this list). PowerShell can also do all your zipping and FTPing to staging servers.
  2. Deployment and customization of Reporting Services reports. RDL is a standard XML-based language, so reports can be customized if you don't like the output of the Visual Studio RDL editor/designer.
  3. Updating Word and Excel 2007 documents automatically. The docx format is now all XML - so PowerShell can perform bulk updates to Word documents for you without the need for Word automation or a separate executable (see the second sketch after this list).
  4. Updating LINQ DBML files to your liking. I will detail an example of this below.
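
As a taste of point 1, here is a minimal sketch of switching a web.config setting per environment. Note that the "Web.config" file name, the "ServiceUrl" appSettings key and the server URLs are hypothetical placeholders, not from a real project:

# Minimal sketch (hypothetical names): point an appSettings entry at the
# right server for the chosen environment.
param ($Environment = "Staging") # e.g. Dev, Staging or Production

$serviceUrls = @{ "Dev"        = "http://dev-server/Service.asmx"; `
                  "Staging"    = "http://staging-server/Service.asmx"; `
                  "Production" = "http://prod-server/Service.asmx" };

# The [xml] accelerator parses the file; elements become plain properties.
$config = [xml](Get-Content ".\Web.config")

# No XML DOM traversal code - just walk the element names.
$setting = $config.configuration.appSettings.add | Where-Object { $_.key -eq "ServiceUrl" }
$setting.value = $serviceUrls[$Environment]

# Save with an absolute path (XmlDocument resolves relative paths against
# the process working directory, not the PowerShell location)
$config.Save((Join-Path $pwd "Web.config"))

And for point 3, because a docx file is just a zip package of XML parts, you can script bulk edits using the System.IO.Packaging API from .NET 3.0 (WindowsBase.dll) - no Word automation required. A sketch, under the assumption that a simple text replacement is all you need (the company names are made up):

# Sketch: bulk find-and-replace across every .docx file in the current folder.
[void][System.Reflection.Assembly]::LoadWithPartialName("WindowsBase")

Get-ChildItem *.docx | ForEach-Object {
    $package = [System.IO.Packaging.Package]::Open($_.FullName, [System.IO.FileMode]::Open, [System.IO.FileAccess]::ReadWrite)

    # The main document body lives in the /word/document.xml part
    $partUri = New-Object System.Uri("/word/document.xml", [System.UriKind]::Relative)
    $part = $package.GetPart($partUri)

    $reader = New-Object System.IO.StreamReader($part.GetStream([System.IO.FileMode]::Open))
    $text = $reader.ReadToEnd()
    $reader.Close()

    # Plain string replace; a robust script would load the XML and handle split runs
    $text = $text.Replace("Old Company Name", "New Company Name")

    $writer = New-Object System.IO.StreamWriter($part.GetStream([System.IO.FileMode]::Create))
    $writer.Write($text)
    $writer.Close()

    $package.Close()
}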
An example of the power of PowerShell: if you are not happy with the output of SQLMetal (e.g. it generates LINQ entities for tables you don't want to see), you can use the following batch file to perform a "cleanup" of the tables you don't want in the DBML. (Thanks to Andrew Weaver for this script - for more PowerShell goodness, see his blog at http://leftlobed.wordpress.com/.)

This batch file and PowerShell script do the following to fix up what the SQLMetal DBML generator doesn't:
  • remove tables that are not needed in the DBML
  • rename associations so they are more meaningful (e.g. if you have multiple relationships between two tables, they are given names such as Lookup1 and Lookup2, so your LINQ code using these relationships won't make much sense)



The Batch File:



sqlmetal /conn:"data source=myServerName\dev;initial catalog=MyDataBaseName;trusted_connection=false;Persist Security Info=True;User ID=UserName;Password=Password;" /dbml:PropertyPipelineDb.dbml /context:InvestmentManagementDataContext /namespace:LendLease.InvestmentManagement.Domain.Entity /pluralize /entitybase:"LendLease.Framework.Domain.Entity.MutableEntity" /views

powershell -command "$temp = .\CleanDbml.ps1 -dbml '.\PropertyPipelineDb.dbml'; $temp | Out-File '.\PropertyPipelineDb.dbml' -Encoding Unicode"

sqlmetal /code:PropertyPipelineDb.designer.cs PropertyPipelineDb.dbml

pause


The PowerShell Script (CleanDbml.ps1)



Param
(
    $DBML # File Path to DBML file
)

$removeTables = ("dbo.vRolePermission", `
                 "dbo.vOpportunityFlattenedFundInterests", `
                 "dbo.vOpportunityDetailsWithFlatFundInterests", `
                 "dbo.vAssetAddress", `
                 "dbo.vContactOpportunities", `
                 "dbo.vContactOpportunitiesWithFlatFundInterests", `
                 "dbo.vLookupHierarchy", `
                 "dbo.CachedUser", `
                 "dbo.CachedUserExtension", `
                 "dbo.OpportunityImport" `
                );

$renameTables = @{ "dbo.vOpportunitySearch" = @{ Type = "vOpportunitySearch" ; Member = "vOpportunitySearches"}; `
                   "Code.NotificationType" = @{ Type = "NotificationType" ; Member = "NotificationTypes"}; `
                   "dbo.vPropertyPipelineFund" = @{ Type = "Fund" ; Member = "Funds"} `
                 };

$renameAssociations = @{ "FK_OpportunityRelationship_Opportunity1" = @{ FKMember = "Opportunity1"; PluralMember = "OpportunityRelationship1s"}; `
                         "FK_OpportunityRelationship_Opportunity2" = @{ FKMember = "Opportunity2"; PluralMember = "OpportunityRelationship2s"}; `
                         "FK_CompanyCompanyType_Company" = @{ FKMember = "Company"; PluralMember = "CompanyTypes"}; `
                         "FK_NotificationType_NotificationType" = @{ FKMember = "ParentNotificationType"; PluralMember = "ChildNotificationTypes"} `
                       };

# Resolve a relative path to the DBML file against the current directory
$exists = Test-Path ($DBML);
if (! $exists)
{
    $DBML = Join-Path -path $pwd -childpath $DBML;
}

$exists = Test-Path ($DBML);
if( $exists )
{
    $dbmlFileInfo = Get-Item -path $DBML;

    [System.Xml.XmlReader]$local:reader = [System.Xml.XmlReader]::Create($dbmlFileInfo.FullName);
    [System.Xml.XmlDocument]$local:doc = new-object System.Xml.XmlDocument;
    $doc.set_PreserveWhitespace($true);
    $doc.Load($reader);

    $reader.Close();

    [System.Xml.XmlElement]$local:root = $doc.get_DocumentElement();

    # Remove table nodes that don't belong, remembering each removed type
    $removedTypeMap = @{};
    $tableNodesRemoved = $doc.Database.Table |
                             ForEach-Object { $_; } |
                             Where-Object { $removeTables -contains $_.Name } |
                             ForEach-Object { $removedTypeMap.Add($_.Type.Name, $_.Name); $_ = $doc.Database.RemoveChild($_); $_; };

    # Remove any associations on other tables which reference a removed type
    $associationNodesWithTypesRemoved = $doc.Database.Table |
                             Where-Object { $_.Type.Association -ne $null } |
                             ForEach-Object { $_.Type.Association; } |
                             Where-Object { $removedTypeMap[$_.Type] -ne $null } |
                             ForEach-Object { $_ = $_.get_ParentNode().RemoveChild($_); $_; };

    # Rename nodes for tables with their new aliases
    $tableNodesToRename = $doc.Database.Table |
                             ForEach-Object { $_; } |
                             Where-Object { $renameTables[$_.Name] -ne $null };

    $renamedTypeMap = @{};
    $tableNodesToRename | ForEach-Object { $newName = $renameTables[$_.Name].Type; $renamedTypeMap.Add($_.Type.Name, $newName); $_.Type.Name = $newName; };

    $renamedMemberMap = @{};
    $tableNodesToRename | ForEach-Object { $newName = $renameTables[$_.Name].Member; $renamedMemberMap.Add($_.Member, $newName); $_.Member = $newName; };

    # Fix up any associations on other tables which reference a renamed type
    $associationNodesWithTypesRenamed = $doc.Database.Table |
                             Where-Object { $_.Type.Association -ne $null } |
                             ForEach-Object { $_.Type.Association; } |
                             Where-Object { $renamedTypeMap[$_.Type] -ne $null } |
                             ForEach-Object { $_.Type = $renamedTypeMap[$_.Type]; $_; };

    # Fix up member names for any associations on other tables which are foreign key associations and reference a renamed type
    $associationNodesWithFKMembersRenamed = $associationNodesWithTypesRenamed |
                             Where-Object { $renamedTypeMap[$_.Member] -ne $null -and $_.IsForeignKey -eq "true" } |
                             ForEach-Object { $_.Member = $renamedTypeMap[$_.Member]; $_; };

    # Fix up member names for any associations on other tables which are subset associations and reference a renamed member
    $associationNodesWithSubsetMembersRenamed = $associationNodesWithTypesRenamed |
                             Where-Object { $renamedMemberMap[$_.Member] -ne $null -and $_.IsForeignKey -eq $null } |
                             ForEach-Object { $_.Member = $renamedMemberMap[$_.Member]; $_; };

    # Rename the foreign key side of each association listed in $renameAssociations
    $associationFKNodesRenamed = $doc.Database.Table |
                             Where-Object { $_.Type.Association -ne $null } |
                             ForEach-Object { $_.Type.Association; } |
                             Where-Object { $renameAssociations[$_.Name] -ne $null -and $_.IsForeignKey -eq "true" } |
                             ForEach-Object { $_.Member = $renameAssociations[$_.Name].FKMember; $_; };

    # Rename the plural (collection) side of each association listed in $renameAssociations
    $associationSubsetNodesRenamed = $doc.Database.Table |
                             Where-Object { $_.Type.Association -ne $null } |
                             ForEach-Object { $_.Type.Association; } |
                             Where-Object { $renameAssociations[$_.Name] -ne $null -and $_.IsForeignKey -eq $null } |
                             ForEach-Object { $_.Member = $renameAssociations[$_.Name].PluralMember; $_; };

    # Write the cleaned document out and emit it as a string to the pipeline
    [System.Xml.XmlWriterSettings]$writerSettings = new-object System.Xml.XmlWriterSettings;
    $writerSettings.set_Indent($true);
    $writerSettings.set_NewLineHandling([System.Xml.NewLineHandling]::Replace);
    $writerSettings.set_CloseOutput($true);

    $local:outputStream = new-object System.IO.StringWriter;
    [System.Xml.XmlWriter]$local:writer = [System.Xml.XmlWriter]::Create($outputStream, $writerSettings);

    $doc.Save($writer);
    $writer.Flush();
    $writer.Close();
    $outputStream.Close();

    $outputStream.ToString();
}

To assist your development of PowerShell scripts, you can use PowerShell Plus (http://www.powershell.com/) to parse and debug your scripts. Another amazing utility is the PowerShell add-in for Reflector: http://www.hanselman.com/blog/ReflectorAddinsAndPowerShellLanguageSupportForReflector.aspx. This actually attempts to disassemble existing .NET DLLs into PowerShell scripts.

There are a whole host of plugins for Lutz Roeder's application mentioned in the Hanselman article above (the Reflector compare of different DLLs looks very useful) - but I'll have to save my comments for a later blog post...


DDK

Using Impersonation Batch Files and RunAs to Assist Web Application Testing

The ASP.NET 3.5 web application I am currently working on uses Integrated Windows Authentication to determine group membership and the associated user permissions. While this is great because users never have to enter login details, it makes it difficult for UAT testers to test functionality under different roles. Never fear... to facilitate ease of testing, you can use the "runas" functionality in Windows 2000/2003/2008/XP/Vista to allow testers to impersonate different test users in Active Directory.



For example, you can use the following "PP_Test_ImpersonateSuperUser.bat" file to impersonate a Super User without logging off and back on as that user. It will also open up IE at the http://pp-test web site (an intranet site).


@ECHO ******Running as SUPER USER (IMTEST11); password is SuperUsersPassWord
@ECHO OFF
COLOR 0A

runas /user:APAC\imtest11 /savecred "%ProgramFiles%\Internet Explorer\iexplore.exe \"pp-test\""

pause


You can now test to your heart's content without having to log off your normal Windows account or log on to Terminal Services on another machine as a different user.


Monday, 19 May 2008

Simple Auditing with LINQ to SQL - Date, Time and User Stamps

My previous post was on automatically generating basic auditing fields on SQL Server tables via SQL scripts. Today, I examine the other side of the coin - using the LINQ data context to stamp records with audit fields such as "Modified", "ModifiedBy", "Created" and "CreatedBy". These stamps are similar to the functionality provided by the Windows file system.

There are many ways you can populate application audit tables or audit fields. These include:
  1. With triggers (but this relies on the SQL authentication mechanism for accurate user data);
  2. With stored procedures (this also relies on the accuracy of SQL authentication information, or you need to append the user name to the parameters of the procedure);
  3. On the Application Side - Manually coding all service/database calls to append this information.
  4. Use a SQL Server log auditing tool like LogExplorer that tracks the SQL Server transaction log.
  5. Leveraging your data access layer's update point to append this information before the update takes place.

Below you can find my code, which takes the last approach - using reflection to stamp records with when they were created/updated and who did the insert/update. (The override below belongs in a partial class that extends your generated LINQ to SQL DataContext.) This is a simplified alternative to a full audit-trail solution with a dedicated audit table (such as http://blog.matthidinger.com/2008/05/09/LINQToSQLAuditTrail.aspx), for when the requirement is just to display who last modified or created a particular record in your application:


/// <summary>
/// Basic audit user and date stamp functionality
/// </summary>
/// <param name="failureMode"></param>
public override void SubmitChanges(ConflictMode failureMode)
{
    //Updates
    for (int changeCounter = 0; changeCounter < this.GetChangeSet().Updates.Count; changeCounter++)
    {
        object modifiedEntity = this.GetChangeSet().Updates[changeCounter];
        SetAuditStamp(this, modifiedEntity, ChangeType.Update);
    }

    //Inserts
    for (int changeCounter = 0; changeCounter < this.GetChangeSet().Inserts.Count; changeCounter++)
    {
        object modifiedEntity = this.GetChangeSet().Inserts[changeCounter];
        SetAuditStamp(this, modifiedEntity, ChangeType.Insert);
    }

    base.SubmitChanges(failureMode);
}

/// <summary>
/// For inserts or updates - set the user and date stamps
/// </summary>
/// <param name="context"></param>
/// <param name="modifiedEntity"></param>
/// <param name="changeType"></param>
private void SetAuditStamp(DataContext context, object modifiedEntity, ChangeType changeType)
{
    string userName = System.Threading.Thread.CurrentPrincipal.Identity.Name;
    const string Created = "Created", CreatedBy = "CreatedBy",
                 Modified = "Modified", ModifiedBy = "ModifiedBy";

    if (changeType == ChangeType.Insert)
    {
        SetAuditValue(modifiedEntity, Created, System.DateTime.Now);
        SetAuditValue(modifiedEntity, CreatedBy, userName);
    }
    else if (changeType == ChangeType.Update)
    {
        SetAuditValue(modifiedEntity, Modified, System.DateTime.Now);
        SetAuditValue(modifiedEntity, ModifiedBy, userName);
    }
}

/// <summary>
/// The type of modification
/// </summary>
private enum ChangeType
{
    Update,
    Insert
}

/// <summary>
/// Set the target property if it exists on the object
/// </summary>
/// <param name="modifiedEntity"></param>
/// <param name="fieldName"></param>
/// <param name="propertyValue"></param>
private void SetAuditValue(object modifiedEntity, string fieldName, object propertyValue)
{
    if (modifiedEntity.GetType().GetProperty(fieldName) != null) //Set current user and time stamp
    {
        modifiedEntity.GetType().GetProperty(fieldName).SetValue(modifiedEntity, propertyValue, null);
    }
}

Friday, 16 May 2008

Microsoft SQL Server 2005 Express Edition with Advanced Services (aka with Reports) - Now Available

An often requested feature for SQL Server Express Edition is the ability to run Reporting Services reports locally on a development machine or as part of a packaged deployment. Up until now, you needed a full version of SQL Server to run Reporting Services reports. This restricted its applicability for packaged product deployment - an area where Crystal Reports has had a lead... until now.

The spiel for this new download is:
"Free to download, free to deploy, and free to redistribute as an embedded part of an application, SQL Server Express with Advanced Services is the fast and easy way to develop and manage data-driven applications with powerful built-in reporting and full-text search functionality."

You can now find the answer to your redistributable reporting and full-text indexing issues right here: http://msdn.microsoft.com/en-us/express/aa718379.aspx

Thursday, 15 May 2008

Automatically Generating Audit Columns for your Database - no monkey work!

Every database should really have audit columns in it - even if your users say they are not required. A simple way to get you up and running (schema-wise) is to use the following Microsoft SQL Server script to generate a text file to add your standard audit columns to every table.

Open a query on your database in SQL 2005 Management Studio. Change your output mode to "text", and then run it by pressing F5. The output window will have your script to run for all tables in your database - table auditing heaven!



SELECT 'ALTER TABLE ' + TABLE_NAME + ' ADD Modified datetime NOT NULL DEFAULT GETDATE(),
ModifiedBy nvarchar(255) NOT NULL DEFAULT USER_NAME(),
Created datetime NOT NULL DEFAULT GETDATE(),
CreatedBy nvarchar(255) NOT NULL DEFAULT USER_NAME() '
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
AND TABLE_NAME <> 'sysdiagrams'

Friday, 9 May 2008

Resetting an Identity Field to 0 when testing a data migration process

I currently have an SSIS package that imports data from Excel by flushing data out of the existing SQL tables and grabbing the new values from the source spreadsheet. I noticed that while testing these scripts, my identity seed value was ominously creeping up. This meant that when the process was moved to the live environment after testing, the identity values would start artificially large.

Never fear - it IS possible to reset the identity seed value by using the "DBCC CHECKIDENT" command. For example, to reset a table so that the next identity value is 1, you can run the following as part of your data migration script:

DBCC CHECKIDENT('TableName', RESEED, 0)

This will force the next value generated to be 1. Also note that the DBCC CHECKIDENT command works even with fully qualified names for the table name parameter. For example:



ALTER PROCEDURE [dbo].[proc_InitializePrimaryTables]
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;

-- Insert statements for procedure here
TRUNCATE TABLE InvestmentManagementDataMigration..OpportunityStatusHistory
DBCC CHECKIDENT('InvestmentManagementDataMigration..OpportunityStatusHistory', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..OpportunityUserRole
DBCC CHECKIDENT('InvestmentManagementDataMigration..OpportunityUserRole', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..FundInterest
DBCC CHECKIDENT('InvestmentManagementDataMigration..FundInterest', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..OpportunitySourceContact
DBCC CHECKIDENT('InvestmentManagementDataMigration..OpportunitySourceContact', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..AssetOwnership
DBCC CHECKIDENT('InvestmentManagementDataMigration..AssetOwnership', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..AssetRelationship
DBCC CHECKIDENT('InvestmentManagementDataMigration..AssetRelationship', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Opportunity
DBCC CHECKIDENT('InvestmentManagementDataMigration..Opportunity', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Contact
DBCC CHECKIDENT('InvestmentManagementDataMigration..Contact', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Company
DBCC CHECKIDENT('InvestmentManagementDataMigration..Company', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Address
DBCC CHECKIDENT('InvestmentManagementDataMigration..Address', RESEED, 0)

END

Trivia:
Just to be tricky, I tried setting the seed to a negative value (-1) just to see what this would do... and the next value generated was 10. I don't know about you, but last time I checked, 10 does not follow -1!

SSIS Excel Data Source Error - “Text was truncated or one or more characters had no match in the target code page.”

I also hit this error at my previous company, http://www.ssw.com.au/, when I was doing a migration from Excel spreadsheets via SQL Server Integration Services (SSIS) for my client, Queensland Water Infrastructure.



As described by my colleague Craig from http://www.oakton.com.au/ at http://waxtadpole.wordpress.com/2008/04/28/hello-world/, this issue also came up at our current client "LL". The issue is that the sample the JET driver takes to determine the input data type for an Excel column is just not large enough. For example, if "Column A" contains values of around 100 characters on every row except row 2000, where it holds a much longer value, the long value will not be sampled and the data type will be set to Unicode string [DT_WSTR] 255 - the import will then fail with a truncation error. Even if you override the size in the Advanced tab, SSIS will keep setting it back to 255. The fix is to either:



  1. Put a large placeholder on the first line of the spreadsheet to force the detection of a larger size field (so it gets promoted to a "Unicode text stream [DT_NTEXT]")


  2. Increase the sample size so that it picks up the correct field size (recommended). You can increase the number of rows sampled by increasing the value of TypeGuessRows under the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel registry key. By default, this is the ludicrously small value of 8(!) rows. For a more reasonable 100 rows, you can copy the following into a .reg file and double-click it to update your registry:


Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel]
"TypeGuessRows"=dword:00000064
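
If you would rather stay in PowerShell (in keeping with the earlier post), the equivalent is a one-liner - run it from a prompt with administrator rights:

# Set the JET sample size to 100 rows (the same change as the .reg file above)
Set-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel" -Name "TypeGuessRows" -Value 100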



After you do this, SSIS will suddenly realize that the column settings are too small and will update the data type to "Unicode text stream [DT_NTEXT]".


Also note that you should close your spreadsheet before opening your SSIS package - otherwise it will wait indefinitely with "Visual Studio is Busy" messages!