Some SQL, some .NET, and whatever else


Calculating working hours between two dates

As a follow-up to an earlier post (Return a list of all dates between a start and end date), I needed to find the number of working hours between two timestamps - in this case, it was to see how long a support ticket had been open before it was initially assigned, but the user didn't want non-work hours to count against them.

To do this, I used the previous script to generate a list of dates and hours, and then marked the rows as work time or not (based on day of week and hour of day, evaluated together). The result was a table that would effectively let me do a SUM to find the value I was looking for. Once I had that table, I could join to it for rows between the two datetimes in question and SUM up rows that had "WorkTime" marked:

SELECT tt.TicketNumber,
       SUM(  CONVERT(INT, wh.IsWorktime)) as WorkHoursBeforeAssigned,
       COUNT(CONVERT(INT, wh.IsWorktime)) as TotalHoursBeforeAssigned
  FROM TroubleTickets tt
  JOIN #WorkingHours wh
    ON wh.EvaluateTime BETWEEN tt.TicketCreateTime
                           AND tt.TicketAssignTime
GROUP BY tt.TicketNumber

In this case, tickets that were created and picked up after hours, without any work time passing, would show as zero hours old (as they should, since the user was only interested in working time) - however, I've also included COUNT here to show total hours as well as work hours.

Also, this script only counts for raw day-of-week and hour-of-day working time - it ignores holidays and other special circumstances. I have a script that tracks holidays (American ones, at least), and I'll put that up shortly as well - if you want to take holidays into account, you could incorporate that into your evaluation.
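Until that holiday script is posted, here's a rough sketch of how holidays could be folded into the evaluation - the #Holidays table, its dates, and the date-truncation trick are my own illustration, not part of the script below:

```sql
-- Hypothetical holiday list - populate with whatever dates apply to you
CREATE TABLE #Holidays (HolidayDate DATETIME PRIMARY KEY)
INSERT INTO #Holidays (HolidayDate) VALUES ('2015-12-25')
INSERT INTO #Holidays (HolidayDate) VALUES ('2016-01-01')

-- Flip any hours that fall on a holiday back to non-working time
-- (run this after the day-of-week/hour-of-day UPDATE below)
UPDATE wh
   SET IsWorktime = 0
  FROM #WorkingHours wh
  JOIN #Holidays h
    ON DATEADD(dd, DATEDIFF(dd, 0, wh.EvaluateTime), 0) = h.HolidayDate
```

The DATEADD/DATEDIFF pair just strips the time portion so each hourly row matches its holiday date - it works on SQL 2005, which lacks the DATE type.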

Here's the script that builds the working time table (you can also download it here):

-- Set things up before we get started
DECLARE @WorkTimeStart		TINYINT,
		@WorkTimeEnd		TINYINT,
		@WorkDayOfWeekStart	TINYINT,
		@WorkDayOfWeekEnd	TINYINT

DECLARE @StartDate			DATETIME,
		@EndDate			DATETIME

CREATE TABLE #WorkingHours (
		EvaluateTime	DATETIME,
		IsWorktime		BIT DEFAULT(0)
)

	SET @WorkTimeStart = 7  --7AM
	SET @WorkTimeEnd   = 16 --4PM hour (4-5PM count as working)
	SET @WorkDayOfWeekStart = 2 --Monday
	SET @WorkDayOfWeekEnd   = 6 --Friday

	SET @StartDate	= '2000-01-01 00:00:00'
	SET @EndDate	= '2020-12-31 23:59:59'


-- Build the list of timestamps we're working with
;WITH numberlist(number)
   AS (SELECT RANK() OVER (ORDER BY c1.object_id, c2.object_id)
         FROM sys.columns c1
        CROSS JOIN sys.columns c2)
INSERT INTO #WorkingHours (EvaluateTime)
SELECT DATEADD(hh, number-1, @StartDate)
  FROM numberlist
 WHERE DATEADD(hh, number-1, @StartDate) <= @EndDate

-- Set the times to worktime if they match criteria
UPDATE #WorkingHours
   SET IsWorktime = CASE WHEN (DATEPART(dw, EvaluateTime)
								BETWEEN @WorkDayOfWeekStart
								AND @WorkDayOfWeekEnd)
						  AND (DATEPART(hh, EvaluateTime)
							   BETWEEN @WorkTimeStart
							   AND @WorkTimeEnd) THEN 1
						 ELSE 0
					END

-- Return the results
 SELECT * FROM #WorkingHours
 ORDER BY EvaluateTime

 DROP TABLE #WorkingHours

Clean up vendor names and other data with unwanted numbers/characters

In creating an accounting report, the vendor names we get back from our credit card processor needed some major clean-up: "52334SOUTHWESTAIR1234", "ABD2343-BLUE DINER 843", and so on. I initially found a great function for this from Pinal Dave.

But I wanted to make a few enhancements to it:

  1. He leaves numbers in the string and I'd like to remove them
  2. I'd like to combine multiple spaces in a row into a single space, but leave single spaces intact

The first is pretty easy to do - just remove the [0-9] and add a space to the PATINDEX. The second one uses a trick from another post I did a few years ago.

Here's the modified version:

CREATE FUNCTION dbo.UDF_ParseAlphaChars2
(
   @string VARCHAR(8000)
)
RETURNS VARCHAR(8000)
AS
BEGIN
   DECLARE @IncorrectCharLoc SMALLINT
   SET @IncorrectCharLoc = PATINDEX('%[^ A-Za-z]%', @string)

   WHILE @IncorrectCharLoc > 0
   BEGIN
      SET @string = STUFF(@string, @IncorrectCharLoc, 1, '')
      SET @IncorrectCharLoc = PATINDEX('%[^ A-Za-z]%', @string)
   END

   -- Trim groups of spaces into single space
   SET @string = LTRIM(RTRIM(REPLACE(REPLACE(REPLACE(@string,' ','<>'),'><',''),'<>',' ')))

   RETURN @string
END

SELECT dbo.UDF_ParseAlphaChars2('ABC"_I+{D[]}4|:e;""5,<.F>/?6')
SELECT dbo.UDF_ParseAlphaChars2('52334SOUTHWESTAIR1234')
SELECT dbo.UDF_ParseAlphaChars2('ABD2343-BLUE DINER 843')

Export from SQL Server to XLS and email results

Sometimes you want to take some query results and export them directly to an XLS file - here's how you can set that up in SQL Server. The biggest caveat is that you need to run it from an x86 instance of SQL Server - the x64 instance won't have access to the Jet driver needed to write the Excel file (Microsoft.Jet.OLEDB.4.0), while the x86 version will. In fact, we maintain an older x86 instance of SQL Server just for random processes like this that need it - x64 is better in almost every case, but we can't seem to completely ditch x86... 🙂

I use a stored proc that I call from a SQL Agent Job, which works great. The actual process is a bit awkward - for starters, you'll need access to xp_cmdshell. SQL Server can't create a new Excel file from scratch, so you have to keep a blank Excel file around, make a copy of it, and then insert into the copy to get your final result.
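Since xp_cmdshell is disabled by default, a sysadmin may need to enable it first via sp_configure before any of this will run:

```sql
-- Enable xp_cmdshell (off by default); requires sysadmin rights
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE
```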

That said, here's the code to generate the XLS file from your query results:

SELECT Column1, Column2, Column3, Column4
  INTO ##YourTempTable
  FROM SomeOtherTable

DECLARE @Folder			VARCHAR(255),
		@DocumentBlank	VARCHAR(100),
		@DocumentLong	VARCHAR(100),
		@CMD			NVARCHAR(2000)

SET @Folder = 'C:\Temp\'
SET @DocumentBlank = 'Your Document - Blank'
SET @DocumentLong = 'Your Document - ' + CONVERT(VARCHAR(10), GETDATE(), 120)

-- Copy the blank Excel file so we can insert into the copy
SET @CMD = 'COPY "' + @Folder + @DocumentBlank + '.xls" "' + @Folder + @DocumentLong + '.xls"'
exec master..xp_cmdshell @CMD

-- Export the Excel sheet
SET @CMD = 'insert into OPENROWSET(''Microsoft.Jet.OLEDB.4.0'',
	''Excel 8.0;Database=' + @Folder + @DocumentLong + '.xls;'',
	''SELECT * FROM [Sheet1$]'')
	select Column1, Column2, Column3, Column4 from ##YourTempTable'

exec sp_executesql @CMD

Once that's exported, you can just set up the email process using sp_send_dbmail and attach the file you just generated:


DECLARE @Attachments	VARCHAR(255),
		@Body			VARCHAR(500)

SET @Attachments = @Folder + @DocumentLong + '.xls'
SET @Body = 'Your file has been generated for ' + CONVERT(VARCHAR(10), GETDATE(), 120)

exec msdb..sp_send_dbmail @profile_name = 'YourMailProfile',
	@recipients = '',
	@subject = 'Your file is ready',
	@body = @Body,
	@file_attachments = @Attachments

Return a list of all dates between a start and end date

In some situations, you'll need to work with a list of numbers or dates that fall between some start and end, and you don't have a complete list handy for joining to. Using the script below, you can create one to use in your query - if you wanted to use this in-line in another SQL statement (and assuming you're using SQL 2005+), you could either do a subquery or a WITH CTE clause and then join directly to it.

To do it, I'm using a table that has a large number of rows in it, even in an empty database (sys.columns), and then doing a cross-join to ensure that I'll have enough rows to satisfy my entire range. This table has about 890 rows in an empty database (or you can use the "model" database if you don't have any user databases handy), meaning that the cross join yields about 800,000 rows - enough for almost 2200 years of days, or 100 years of hourly increments (change the "dd" in the "DATEADD" statements below to "hh" or even "mi" to do any increment of time you want).

The code:

DECLARE @StartDate DATETIME,
	    @EndDate   DATETIME
	SET @StartDate = '2012-12-01'
	SET @EndDate   = '2015-12-31'
;WITH numberlist(number)
   AS (SELECT RANK() OVER (ORDER BY c1.object_id, c2.object_id)
	     FROM sys.columns c1
	    CROSS JOIN sys.columns c2)
SELECT DATEADD(dd, number-1, @StartDate)
  FROM numberlist
 WHERE DATEADD(dd, number-1, @StartDate) <= @EndDate
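As an example of joining to the list in-line, here's how you might find dates in the range with no activity - the Orders table and OrderDate column are hypothetical stand-ins for your own data:

```sql
DECLARE @StartDate DATETIME,
        @EndDate   DATETIME
SET @StartDate = '2012-12-01'
SET @EndDate   = '2015-12-31'

;WITH numberlist(number)
   AS (SELECT RANK() OVER (ORDER BY c1.object_id, c2.object_id)
         FROM sys.columns c1
        CROSS JOIN sys.columns c2),
      datelist(CalendarDate)
   AS (SELECT DATEADD(dd, number-1, @StartDate)
         FROM numberlist
        WHERE DATEADD(dd, number-1, @StartDate) <= @EndDate)
SELECT d.CalendarDate
  FROM datelist d
  LEFT JOIN Orders o
    ON o.OrderDate = d.CalendarDate
 WHERE o.OrderDate IS NULL  -- dates with no matching rows
```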

I'm using dates above, but if you wanted to use INT instead, it's pretty straightforward:

DECLARE @Start INT,
		@End   INT
	SET @Start = 1500
	SET @End   = 64000
;WITH numberlist(number)
   AS (SELECT RANK() OVER (ORDER BY c1.object_id, c2.object_id)
	     FROM sys.columns c1
	    CROSS JOIN sys.columns c2)
SELECT @Start + number - 1
  FROM numberlist
 WHERE @Start + number - 1 <= @End

Roll your own lightweight SQL Server source control

I've wanted to implement some kind of source control on my SQL Servers before, but the only product available at the moment is Red Gate's SQL Source Control, and I didn't need all the functionality it offered (or want to pay for it). It also relies on developers checking in their changes, which is prone to forgetfulness. Sure, you're protected against accidental drops, but what if somebody tweaks something in production directly, without using their development tool, and never checks it back in? You're hosed.

All I wanted was a simple process that would run automatically, taking periodic snapshots of the database objects and recording any changes. I decided to roll my own - it's quick, simple, can be set up to run on a schedule, and automatically includes any new databases created on the server without any intervention.

This Stored Procedure goes through the following steps:

  1. If the Master.dbo.coSourceControl table (used to store the history) doesn't exist, it creates it
  2. For each database on the server (so new databases are added automatically), it:
    1. Grabs the text contents of all the user objects (not flagged as "IsMsShipped")
    2. Compares the contents of each to the last known copy (if there is one)
    3. If the object is new or has changed, add a new copy to the source control table in master
  3. Output the number of objects updated
  4. Optionally, it could email somebody to tell them about the results, but it currently does not

The history is kept in a single table - master.dbo.coSourceControl - which has the database it came from, the object_id, the object name, object contents, and the timestamp. Since it uses the object_id to track things, it will also record a name change in an object, even if the contents didn't change.
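Based on that description, the supporting table looks something like the sketch below. The script creates it for you, so treat this as an illustration - the databaseid, objectid, and ObjectText columns appear in the queries in this post, but the remaining names and the exact types are my guesses:

```sql
CREATE TABLE master.dbo.coSourceControl (
	databaseid	INT,			-- database the object came from
	objectid	INT,			-- object_id within that database
	ObjectName	SYSNAME,		-- name at the time of the snapshot
	ObjectText	NVARCHAR(MAX),	-- full text of the object
	SourceDate	DATETIME		-- when this version was captured
)
```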

To implement it, just grab the script and run it in the master database - it will create the stored procedure coSourceControlRefresh. That's it - now either run it on demand, or you can schedule it. It will create the supporting table (if it's missing) and scan every database every time it's run. To see the history for an object, just do:

  SELECT db_name(databaseid) as [Database],
         object_name(objectid) as [Object Name],
         ObjectText
    FROM master.dbo.coSourceControl
   WHERE object_name(objectid) LIKE '%The name of some object%'

Restoring a dropped or changed database object should be as simple as running the query above, grabbing the contents of ObjectText you're interested in, and then pasting it in another window and executing it. Bam - previous version of the object restored (and this stored proc should, the next time it runs, see that you've altered the object and record that there's a "new" version of it).

If you run it and like it - or don't like it - please leave a comment to let me know - nothing expected in return, but it's nice to know when people find it useful. I'm happy to make any enhancements you'd like to see. I hope you enjoy it and it's able to save you from the headache of a dropped database object to which you can't find the source!

Download the Source Control database script


Lightweight, single-row alternative to OUTPUT clause in T-SQL

SQL Server 2005 adds the option for an OUTPUT clause in your query to act upon table rows and return the old and new values. When I've done queuing in the past, I've used the clause to mark a row as processing and return the value, all in a single operation, so it's lightweight and threadsafe. For example, like this:

UPDATE TOP (1) dbo.MyQueue
   SET ClaimedBy = @Server,
       ClaimTime = @ClaimTime
OUTPUT inserted.QueueID, inserted.Column1, inserted.Column2, inserted.Column3
  INTO #OutputTable (QueueID, Column1, Column2, Column3)
 WHERE Some Criteria...

To do this, you'll need to create a table called #OutputTable that has the right schema, which works well if you're returning multiple rows from your query, but is a little cumbersome to work with if you're only doing one row at a time. If you're only returning a single row from your UPDATE query (as I am here), there's an alternative to OUTPUT that's easier to use - just do variable assignment inline in the UPDATE statement! The query above becomes:

UPDATE TOP (1) dbo.MyQueue
   SET ClaimedBy = @Server,
       ClaimTime = @ClaimTime,
       @QueueID = QueueID,
       @OutputVar1 = SomeData1,
       @OutputVar2 = SomeData2,
       @OutputVar3 = SomeData3
 WHERE Some Criteria...

Notice the reversed variable assignment in the second query? I've done away with my table, and my OUTPUT clause, and now I just have the relevant values from the row I'm interested in. Much easier to work with, and as an added bonus (though I hope you're not in this situation), it works just fine in SQL 2000.

The caveat is that it's only good for a single row, and it only works for UPDATE - if you're using DELETE, you'll still need the temp table and an OUTPUT clause.
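For completeness, the DELETE version with the temp table and OUTPUT clause would look roughly like this - same hypothetical queue table as above, with the deleted.* pseudo-columns catching the removed row:

```sql
-- Temp table to catch the deleted row(s)
CREATE TABLE #OutputTable (
	QueueID	INT,
	Column1	VARCHAR(50),
	Column2	VARCHAR(50),
	Column3	VARCHAR(50)
)

DELETE TOP (1)
  FROM dbo.MyQueue
OUTPUT deleted.QueueID, deleted.Column1, deleted.Column2, deleted.Column3
  INTO #OutputTable (QueueID, Column1, Column2, Column3)
 WHERE Some Criteria...
```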


Removing an arbitrary number of spaces from a string in SQL Server

When I was concatenating some fields to create an address, I ended up with a number of spaces in my address that were unnecessary. For example, I had this:

SELECT StreetNumber + ' ' + Direction + ' ' + StreetName + ' ' + StreetType as Address

However, when an address didn't have a direction, I ended up with a double-space in the middle of my address, and I wanted a way to clean it up. Enter the code I found:

SELECT name,
       REPLACE(REPLACE(REPLACE(name,' ','<>'),'><',''),'<>',' ')
  FROM SomeTable

This shortens any run of spaces in the string into a single space - sneaky! It works in any language that supports a function like REPLACE, which scans one string for instances of a second string, and swaps them out for something else.
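To see why the trick works, walk an example through the three REPLACEs: every space becomes '<>', every '><' pair (the interior of a run of spaces) collapses away, and the surviving '<>' markers turn back into single spaces:

```sql
SELECT REPLACE(REPLACE(REPLACE('A  B   C', ' ', '<>'), '><', ''), '<>', ' ')
-- Step 1 (' ' -> '<>'):  'A<><>B<><><>C'
-- Step 2 ('><' -> ''):   'A<>B<>C'
-- Step 3 ('<>' -> ' '):  'A B C'
```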


Migrate database indexes to a new file group

I recently had to mass-migrate all the indexes from a database to a new file group since we'd added some additional storage to our database server. I found this article at SQL Server Central (unfortunately, registration required, so I've included a copy of the original script in the download at the end). While it worked okay, there were some things I didn't like about it:

  • Assumed 90% index fill rate
  • "Moved" indexes were all created as non-unique, regardless of original
  • Fail during index creation left you without an index (drop and then create, with no rollback)
  • Table was un-indexed during the move (index dropped and then created)
  • Script re-created indexes without any "Included" columns, even if original index had them

To address these limitations, I rebuilt the process using that script as a starting point. The new script:

  • Uses 90% fill rate by default, but if the original index had a different rate specified, it will use that
  • Re-creates indexes as unique if the source index was unique
  • Rollback problem resolved - new index is created with different name, old index is dropped, and then new index is renamed, all in a TRY-CATCH block
  • Since the new index is created and then the old one dropped, table indexing remains "online" during the move
  • Migrates "Included" columns in index
  • Updated the script to use SYS views (breaks compatibility with SQL 2000, since SYS is 2005/2008/beyond only)

I welcome any feedback on the script, and would love to know if you see any improvements that should be made.

Download .SQL scripts (contains both Original and Modified scripts)


Finding unused tables in SQL Server 2005 and 2008

Recently, I was tasked with "cleaning up" a very large database on our network - it included hundreds of tables with cryptic names, and I wasn't able to tell which ones were still being used and which weren't. There are triggers for INSERT, UPDATE, and DELETE, but no trigger for SELECT, and that's what I wanted.

However, SQL Server 2005 and later provide something that's almost as good - the sys.dm_db_index_usage_stats system view. This view has table and index statistics for every table in the database and you can use it to determine when a table was last accessed. Though I initially thought this table only contained index stats, so would be useless against tables without indexes, that's not the case; it contains tables themselves as well, and calls them "HEAP" indexes. This way, you can see which tables are being scanned against often (a sign that a better set of indexes is needed), or which indexes aren't being accessed at all and can safely be removed.

Using this data, it's easy to determine which tables haven't been accessed since the server was last restarted:

WITH LastActivity (ObjectID, LastAction) AS
(
  SELECT object_id,
         last_user_seek
    FROM sys.dm_db_index_usage_stats u
   WHERE database_id = db_id(db_name())
   UNION ALL
  SELECT object_id,
         last_user_scan
    FROM sys.dm_db_index_usage_stats u
   WHERE database_id = db_id(db_name())
   UNION ALL
  SELECT object_id,
         last_user_lookup
    FROM sys.dm_db_index_usage_stats u
   WHERE database_id = db_id(db_name())
)
  SELECT OBJECT_NAME(so.object_id) AS TableName,
         MAX(la.LastAction) as LastSelect
    FROM sys.objects so
    JOIN LastActivity la
      ON so.object_id = la.ObjectID
   WHERE so.type = 'U'
     AND so.object_id > 100
GROUP BY OBJECT_NAME(so.object_id)
ORDER BY OBJECT_NAME(so.object_id)

Since the table is cleared when the SQL service restarts, this will only display the tables not accessed since the last time the server was restarted. Because of this, you'll need to ensure that the SQL Server has been running for sufficiently long before you rely on this query to see which tables aren't accessed by users.

Keep in mind that, even if the server has been running for months and a table is still in this list, it may not be safe to delete it. Some tables may be part of year-end or rare processes. This list should be used as a guide to help you figure out what's safe to delete, and you may even consider renaming objects for a while first, so that any processes that do end up relying on one of these tables can be easily corrected by renaming the objects back.
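The rename-first approach can be done with sp_rename, and reversed just as easily if something turns out to need the table - the table and suffix here are hypothetical:

```sql
-- Rename a suspect table out of the way for a while
EXEC sp_rename 'dbo.SomeCrypticTable', 'SomeCrypticTable_ToDelete'

-- If a rare process turns out to rely on it, put it back
EXEC sp_rename 'dbo.SomeCrypticTable_ToDelete', 'SomeCrypticTable'
```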


Moving a SQL Server database to another server on a schedule – without using replication

Recently, I had the need to copy a set of databases from a dozen remote servers to a central server, restore them, and have it happen automatically, with no intervention from me at all. Replication wouldn't work for the following reasons:

  1. Many tables didn't have primary keys, so merge replication was out (even though this was only one-way replication)
  2. The size of the databases (28GB in one instance) and the quality/speed of the WAN removed the log shipping option
  3. There's too much activity to consider any kind of live replication

Given our restrictions, we decided to go the following route. On the remote server, we set up a batch file that did the following:

  1. Use OSQL to back up the databases in question to a folder
  2. Run 7Zip from the command line to compress the backups into separate archives. To support auto-attaching later, each archive was named for the database we wanted it restored as on the central server (for example, Site1ProdDB was backed up to Site1ProdDB.BAK, then compressed to Site1ProdDB.7z)
  3. Delete the BAK files
  4. Archives were renamed from *.7z to *.7zz (this is important - I'll explain why in the server part)
  5. Scripted FTP using Windows command line FTP tool to a folder on our central collection server
  6. Once the FTP was complete, rename the archives on the remote server back from *.7zz to *.7z
  7. Delete the local *.7zz files

That's it for the client - the BAT file was scheduled as a SQL Agent job so that we could kick it off remotely from any site we wanted, or so we could set them up on a schedule. Then, we put a BAT file on the server that did the following:

  1. Check folder for files that match *.7z
  2. For each one found, do the following:
    1. Extract it to a "Staging" folder
    2. Delete the 7z file for that archive
    3. Use OSQL to restore the file from the command line
    4. Use OSQL to run a script that changes the DB owner, adds some user permissions, and generally does some housework on the database
    5. Use an SMTP tool to send a email notice that the backup has been restored
  3. Repeat step 2 for every .7z file in the folder
  4. As a second step in the SQL Agent job, run "MoveLog.bat" (included below) to finish rotating the logs - it ensures that only logs with meaningful information are kept

The server BAT process can run as often as desired - in our case, we run it every 30 minutes, so the backup will be picked up and restored as soon as it's available. That's where the rename from the client side comes into play: If the files were named Database.7z, then the server process would attempt to pick them up while they're being uploaded via FTP, and shenanigans would ensue. By renaming them when they're done uploading, they become immediately available for restoring on the server side.

As I said before, I scheduled both the client (source) and the server (restore/destination) process as SQL Agent jobs - the Windows scheduler is too cumbersome to work with remotely, and kicking them off on demand was a pain. With the SQL Agent, they can be started on demand, and then I get an email notification as soon as they've been successfully restored.

I've attached the files below, and I welcome any feedback that you have or any improvements that can be made - I'm happy to give you credit and post a new version here. Specifically, I'm interested in any feedback about how to make this process more dynamic - I know BAT scripting supports FOR loops and wildcards, but I was unable to make them work properly with OSQL, so I'd appreciate any input there. Enjoy!

Download the ZIP archive containing the files for this post
