Monday, 19 May 2008

Simple Auditing with LINQ to SQL - Date, Time and User Stamps

My previous post was on automatically generating basic auditing fields on SQL Server tables via SQL scripts. Today, I examine the other side of the coin - using the LINQ to SQL data context to stamp records with audit fields such as "Modified", "ModifiedBy", "Created" and "CreatedBy". These stamps are similar to the date-created and date-modified information kept by the Windows file system.

There are many ways you can populate application audit tables or audit fields. These include:
  1. With triggers (but this relies on the SQL authentication mechanism for accurate user data - see the sketch after this list);
  2. With stored procedures (this also relies on accurate SQL authentication information, unless you pass the user name in as a parameter of each procedure);
  3. On the application side - manually coding every service/database call to append this information;
  4. With a SQL Server log auditing tool such as LogExplorer, which tracks the SQL Server transaction log;
  5. By leveraging your data access layer's single update point to append this information before the update takes place.
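
For contrast, option 1 typically looks something like the sketch below - a hypothetical AFTER UPDATE trigger on a "Customer" table (the table and column names are illustrative only). Note that it has to fall back to SUSER_SNAME(), which only identifies the real end user if everyone connects with their own SQL/Windows login rather than through a shared application account:

CREATE TRIGGER trg_Customer_AuditStamp ON Customer
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Stamp the modified rows; SUSER_SNAME() returns the current login,
    -- which is only meaningful when users have individual logins
    UPDATE c
    SET Modified = GETDATE(),
        ModifiedBy = SUSER_SNAME()
    FROM Customer c
    INNER JOIN inserted i ON c.CustomerID = i.CustomerID;
END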

Below you can find my code, which takes the last approach - it uses reflection to stamp each record with when it was created or updated and who performed the insert or update. This is a simplified alternative to a full audit trail solution (such as http://blog.matthidinger.com/2008/05/09/LINQToSQLAuditTrail.aspx, which keeps a separate audit table), and it suits applications where the requirement is simply to display who created or last modified a particular record:

/// <summary>
/// Basic audit user and date stamp functionality
/// </summary>
/// <param name="failureMode"></param>
public override void SubmitChanges(ConflictMode failureMode)
{
    ChangeSet changeSet = this.GetChangeSet();

    //Updates - stamp who/when modified
    foreach (object modifiedEntity in changeSet.Updates)
    {
        SetAuditStamp(this, modifiedEntity, ChangeType.Update);
    }

    //Inserts - stamp who/when created
    foreach (object modifiedEntity in changeSet.Inserts)
    {
        SetAuditStamp(this, modifiedEntity, ChangeType.Insert);
    }

    base.SubmitChanges(failureMode);
}

/// <summary>
/// For inserts or updates - set the user and date stamps
/// </summary>
/// <param name="context"></param>
/// <param name="modifiedEntity"></param>
/// <param name="changeType"></param>
private void SetAuditStamp(DataContext context, object modifiedEntity, ChangeType changeType)
{
    string userName = System.Threading.Thread.CurrentPrincipal.Identity.Name;
    const string Created = "Created", CreatedBy = "CreatedBy",
                 Modified = "Modified", ModifiedBy = "ModifiedBy";

    if (changeType == ChangeType.Insert)
    {
        SetAuditValue(modifiedEntity, Created, System.DateTime.Now);
        SetAuditValue(modifiedEntity, CreatedBy, userName);
    }
    else if (changeType == ChangeType.Update)
    {
        SetAuditValue(modifiedEntity, Modified, System.DateTime.Now);
        SetAuditValue(modifiedEntity, ModifiedBy, userName);
    }
}

/// <summary>
/// The type of modification being made
/// </summary>
private enum ChangeType
{
    Update,
    Insert
}

/// <summary>
/// Set the target property if it exists on the entity
/// </summary>
/// <param name="modifiedEntity"></param>
/// <param name="fieldName"></param>
/// <param name="propertyValue"></param>
private void SetAuditValue(object modifiedEntity, string fieldName, object propertyValue)
{
    var auditProperty = modifiedEntity.GetType().GetProperty(fieldName);
    if (auditProperty != null) //Only stamp entities that actually have the audit column
    {
        auditProperty.SetValue(modifiedEntity, propertyValue, null);
    }
}
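
To wire this up, the override and helpers above sit in a partial class that extends the designer-generated data context - a minimal sketch, where MyDataContext is an illustrative name standing in for your own generated context:

using System.Data.Linq;

//Partial class extending the designer-generated LINQ to SQL context
//(MyDataContext is a hypothetical name - substitute your own context class)
public partial class MyDataContext
{
    //... the SubmitChanges override, SetAuditStamp, ChangeType and SetAuditValue from above ...
}

Any entity that exposes Created, CreatedBy, Modified or ModifiedBy properties is then stamped automatically every time SubmitChanges is called; entities without those properties are simply skipped by SetAuditValue.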

Friday, 16 May 2008

Microsoft SQL Server 2005 Express Edition with Advanced Services (aka with Reports) - Now Available

An often requested feature for SQL Server Express Edition is the ability to run Reporting Services reports locally, whether on a development machine or as part of a packaged deployment. Up until now, you needed a full version of SQL Server to run Reporting Services reports, which restricted its applicability for packaged product deployment - and is where Crystal Reports has had a lead... until now.

The spiel for this new download is:
"Free to download, free to deploy, and free to redistribute as an embedded part of an application, SQL Server Express with Advanced Services is the fast and easy way to develop and manage data-driven applications with powerful built-in reporting and full-text search functionality."

You can now find the answer to your redistributable reporting and full-text indexing issues right here: http://msdn.microsoft.com/en-us/express/aa718379.aspx

Thursday, 15 May 2008

Automatically Generating Audit Columns for your Database - no monkey work!

Every database should really have audit columns in it - even if your users say they are not required. A simple way to get up and running (schema-wise) is to use the following Microsoft SQL Server script, which generates the statements needed to add your standard audit columns to every table.

Open a new query against your database in SQL Server 2005 Management Studio, change the output mode to "Results to Text", and then run the script below by pressing F5. The results window will contain an ALTER TABLE statement for every table in your database - table auditing heaven!



SELECT 'ALTER TABLE ' + TABLE_NAME + ' ADD Modified datetime NOT NULL DEFAULT GETDATE(),
ModifiedBy nvarchar(255) NOT NULL DEFAULT USER_NAME(),
Created datetime NOT NULL DEFAULT GETDATE(),
CreatedBy nvarchar(255) NOT NULL DEFAULT USER_NAME() '
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
AND TABLE_NAME <> 'sysdiagrams'
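
For a hypothetical table named Customer, each generated statement in the results window looks like the following; copy the generated output back into a query window and run it to apply the columns:

ALTER TABLE Customer ADD Modified datetime NOT NULL DEFAULT GETDATE(),
ModifiedBy nvarchar(255) NOT NULL DEFAULT USER_NAME(),
Created datetime NOT NULL DEFAULT GETDATE(),
CreatedBy nvarchar(255) NOT NULL DEFAULT USER_NAME()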

Friday, 9 May 2008

Resetting an Identity Field to 0 when testing a data migration process

I currently have an SSIS package that imports data from Excel by flushing the existing SQL tables and loading the new values from the source spreadsheet. While testing these scripts, I noticed that my identity values were ominously creeping up - meaning that when I moved the migration to live after testing, the identity values would start artificially large.

Never fear - it IS possible to reset the identity seed value by using the "DBCC CHECKIDENT" command. For example, to reset a table so that the next identity value generated is 1, you can run the following as part of your data migration script:

DBCC CHECKIDENT('TableName', RESEED, 0)

This will force the next value generated to be 1. Also note that the DBCC CHECKIDENT command even works with fully qualified names for the tablename parameter. For example:



ALTER PROCEDURE [dbo].[proc_InitializePrimaryTables]
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;

-- Insert statements for procedure here
TRUNCATE TABLE InvestmentManagementDataMigration..OpportunityStatusHistory
DBCC CHECKIDENT('InvestmentManagementDataMigration..OpportunityStatusHistory', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..OpportunityUserRole
DBCC CHECKIDENT('InvestmentManagementDataMigration..OpportunityUserRole', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..FundInterest
DBCC CHECKIDENT('InvestmentManagementDataMigration..FundInterest', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..OpportunitySourceContact
DBCC CHECKIDENT('InvestmentManagementDataMigration..OpportunitySourceContact', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..AssetOwnership
DBCC CHECKIDENT('InvestmentManagementDataMigration..AssetOwnership', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..AssetRelationship
DBCC CHECKIDENT('InvestmentManagementDataMigration..AssetRelationship', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Opportunity
DBCC CHECKIDENT('InvestmentManagementDataMigration..Opportunity', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Contact
DBCC CHECKIDENT('InvestmentManagementDataMigration..Contact', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Company
DBCC CHECKIDENT('InvestmentManagementDataMigration..Company', RESEED, 0)
DELETE FROM InvestmentManagementDataMigration..Address
DBCC CHECKIDENT('InvestmentManagementDataMigration..Address', RESEED, 0)

END
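
As a side note, if you want to double-check a table's current identity value without changing anything, DBCC CHECKIDENT also supports the NORESEED option, which simply reports the current identity value and the maximum value in the identity column:

DBCC CHECKIDENT('TableName', NORESEED)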

Trivia:
Just to be tricky, I tried setting the seed to a negative value (-1) just to see what it would do... and the next value generated was 10. I don't know about you, but last time I checked, 10 does not follow -1!

SSIS Excel Data Source Error - “Text was truncated or one or more characters had no match in the target code page.”

I also hit this error at my previous company http://www.ssw.com.au/ when I was doing a migration from Excel spreadsheets via SQL Server Integration Services (SSIS) for my client Queensland Water Infrastructure.



As described by my http://www.oakton.com.au/ colleague Craig at http://waxtadpole.wordpress.com/2008/04/28/hello-world/, this issue also came up at our current client "LL". The problem is that the sample the JET driver takes to determine the input data type of an Excel column is just not large enough. For example, if "Column A" holds around 100 characters on every row except row 2000, where it holds a much longer value, that long value is never sampled - the data type is set to Unicode string [DT_WSTR] with a length of 255, and the import then fails with a truncation error. Even if you override the length in the Advanced tab, SSIS will keep setting it back to 255. The fix is to either:



  1. Put a large placeholder on the first line of the spreadsheet to force the detection of a larger size field (so it gets promoted to a "Unicode text stream [DT_NTEXT]")


  2. Increase the sample size so that it picks up the correct field size (Recommended). You can increase the number of rows sampled by increasing the value of TypeGuessRows under the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel registry key. By default, this is the ludicrously small value of 8(!) lines. For a more reasonable 100 lines, you can copy the following into a .reg file and double click to update your registry:


Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel]
"TypeGuessRows"=dword:00000064



After you do this, the Excel source will pick up that the column is longer than 255 characters and update the data type to "Unicode text stream [DT_NTEXT]".

Also note that you should close your spreadsheet before opening your SSIS package - otherwise it will wait indefinitely with "Visual Studio is Busy" messages!