While I was downloading the latest version of Xcode on my home machine (an old Mac Cube), I ran out of disk space. As I didn’t feel like upgrading the disk (and even if I wanted to, no stores are open at this time), I decided I’d better hunt down the files that were taking up all that disk space. On my PC, I normally use JDiskReport from JGoodies for that, but for some reason it didn’t want to scan my disk. However, Google found a Mac-specific app that seems to do the trick: Disk Inventory X. A few minutes after installing it, I’d already managed to remove 2.5 GB of old data…

To check whether a table already contains an identity column (mostly useful in change scripts), you can use the following SQL:


-- Returns a row if the table already has an identity column
-- (OBJECTPROPERTY's TableHasIdentity property is 1 when an identity column exists)
SELECT *
FROM information_schema.tables
WHERE table_name = 'your_table_name'
  AND OBJECTPROPERTY(OBJECT_ID(table_name), 'TableHasIdentity') = 1
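
For instance, a change script could turn the same property check around and use it as a guard before adding an identity column. This is just a sketch, not from an actual script; the table and column names below are placeholders:

-- Hypothetical change-script guard: only add an identity column
-- if the table doesn't already have one (property returns 0)
IF OBJECTPROPERTY(OBJECT_ID('your_table_name'), 'TableHasIdentity') = 0
BEGIN
    ALTER TABLE your_table_name ADD id INT IDENTITY(1, 1)
END
GO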

On my current project, we’ve been running out of disk space on our development servers. The culprit is SQL Server’s tempdb. During installation, it was left on the c: drive, which is of course our smallest disk. User databases can easily be moved via detach/attach, but that doesn’t work for tempdb; moving tempdb requires a different approach. First, run


use tempdb
go
sp_helpfile
go

to see where the tempdb files are currently located (you’ll need these paths later). Then point them at another location with


use master
go
Alter database tempdb modify file (name = tempdev, filename = 'd:\databases\tempdb.mdf')
go
Alter database tempdb modify file (name = templog, filename = 'e:\database_logs\templog.ldf')
go

and restart SQL Server. However, I noticed that even after the restart, disk space was still being taken up on the c: drive by tempdb! It turns out SQL Server does not remove the old files for you: the solution was simply to delete the old tempdb.mdf and templog.ldf files on the c: drive (at the paths sp_helpfile returned before the move).
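
To double-check that the move actually took effect after the restart, you can look at the file locations the server has registered for tempdb. A quick check, assuming SQL Server 2005 or later (on SQL Server 2000, re-running sp_helpfile as above serves the same purpose):

-- List the registered file locations for tempdb after the restart
SELECT name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID('tempdb')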

Over the last couple of weeks, I found some blogs that are just lists of cool software (in various categories: development tools, utilities, …). I’ll list just a few of them:

I already knew (or had heard of) a lot of the tools in these lists, but I still managed to find a few gems:

  • synergy – share your mouse and keyboard between multiple systems, even cross-platform (without a KVM switch, that is). I’ve just installed it on both my Windows laptop and my Mac, and if I move the mouse on my PC past the right side of the screen, it appears on the left side of my Mac! Everyone who has ever sat at a desk with two PCs (each with its own keyboard and mouse) has inevitably wished they could just copy-paste between the two systems. Well, with synergy you can! Very cool…
  • Colibri – Quicksilver (from the Mac), but for the PC. Launch *any* application with just a few keystrokes. Like the GUI but miss the shell? Colibri is the best of both worlds!

It’s been a long time since the last post… I’m sure this is something every blogger encounters: when you first start a blog, you’re full of enthusiasm and you promise to write as often as you can…

But then, in comes life 🙂 So, after a honeymoon (yes, I got married!) and returning to a project in the middle of ‘deadline month’, I haven’t really had the time to write anything here…

After being hit with numerous collation problems on a previous project, I’ve been very careful about never explicitly stating the collation in my SQL scripts. I just make sure that the database default collation is set correctly, and all tables automatically have the correct collation. Since then, I’ve managed to avoid any collation problems. Until yesterday that is! Some background: I managed to mess up the collation when I was installing SQL Server 2005 on my laptop some months ago. As a result, most of my databases have a different collation than the server they are running on. Normally, this isn’t a problem, but you should be careful when you’re working with temp tables.
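
If you want to know whether you’re in the same boat, a quick way is to compare the server collation with your database’s collation. A small sketch, where 'MyDatabase' is just a placeholder for your own database name:

-- Compare the server default collation with a specific database's collation;
-- temp tables created with CREATE TABLE pick up the server/tempdb collation
SELECT SERVERPROPERTY('Collation') AS server_collation,
       DATABASEPROPERTYEX('MyDatabase', 'Collation') AS database_collation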

When you explicitly create the temp table

CREATE TABLE #MyTempTable (
    name varchar(30),
    firstName varchar(20)
)

INSERT INTO #MyTempTable(name, firstname)
SELECT name, firstName
FROM somewhere

the table will use the collation of tempdb (where the table is actually created), not that of your current database! Typically, tempdb has the same collation as the database server, so in my case, where the database collation differs from the server collation, trying to insert entries from the temp table into another table:

INSERT INTO Person(name, firstName)
SELECT name, firstName
FROM #MyTempTable
EXCEPT
SELECT name, firstName
FROM Person

results in the following error:

Cannot resolve the collation conflict between "Latin1_General_CI_AS" and "SQL_Latin1_General_CP1_CI_AS" in the EXCEPT operation.
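
You can see the mismatch for yourself by looking at the collation the temp table’s columns actually received. A sketch using the tempdb catalog views available from SQL Server 2005 onwards (the temp table has to still exist in your session):

-- Show the collation assigned to each column of the temp table in tempdb
SELECT name, collation_name
FROM tempdb.sys.columns
WHERE object_id = OBJECT_ID('tempdb..#MyTempTable')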

The easiest way to fix this is to let SQL Server create the temp table for you via a SELECT INTO statement:

SELECT name, firstName
INTO #MyTempTable
FROM somewhere

The temp table will still be created in tempdb, but its columns will take their collation from their source (in this case, the columns of the <somewhere> table).

Of course, the SELECT INTO statement has the additional benefit that you don’t have to keep the temp table structure up-to-date manually!

On my current project, we’re using SSIS quite extensively for uploading data (text files) from the mainframe to our database. Although the bulk of the work is finished, I still had to bundle everything into an easy-to-run package. Until now, I had always run all the different packages that make up the upload manually. So last week, I spent some time creating a ‘master’ package that calls all the (child) packages in the correct order. I first edited all my child packages to use parent package configurations: this way, I can specify the configuration at the top level, and all child packages pick up this setting automatically (and I can still run the packages by themselves if I want). I then created the parent package: it consists of a number of ExecutePackage tasks, which just pass control to the configured child package. I thought all of this could be completed in a few hours, but alas, when I tried to run the parent package, I almost immediately got the following error:

Error: Error 0xC0012050 while loading package file “<SomeChildPackage>.dtsx”. Package failed validation from the ExecutePackage task. The package cannot run. .

I hoped the child package’s output would show more information on what went wrong, but no, nothing. After some experimenting, I found that I could get things to work by setting the ExecuteOutOfProcess property of the ExecutePackage task to true. This forces SSIS to create a separate process for each child package to run in. However, I noticed that these processes are not cleaned up quickly: several minutes after the parent package had completed, I still saw several dtshost.exe processes running, each taking up a considerable amount of memory. By the way, I half expected this to be the case, as I remembered this post.

So, although I now had a workaround, I was still confused as to why my packages wouldn’t run in the same process as their parent. I didn’t want to spend more time on it, so I went on to create some other parent packages (for other parts of the project). I tried running one before I had set the ExecuteOutOfProcess property on each child package, and behold, some (but not all) of the child packages did run! I compared the properties of the child packages (both of the ExecutePackage task and of the child package itself), and found that the only difference between a working and a non-working child package was that the DelayValidation property of the non-working ones was set to true! Changing it to false solved the problem!

So, the short version: if you get the above error when running a child package with the ExecutePackage task, make sure that DelayValidation is set to false on the child package itself (the package definition, not the ExecutePackage task)!