Saturday, February 25, 2012

OLE DB Command and Destination writing to the same table

Hi,

I have a data flow task that performs an "upsert" by directing successful rows from a Lookup to an OLE DB Command that updates rows, and unsuccessful rows (the Lookup error output) to an OLE DB Destination for insertion.

The problem is that execution hangs when both tasks update/insert into the same table (execution is still hung after 20 minutes). Modifying the OLE DB Destination to insert into a different table succeeds (execution completes within 2 minutes). Replacing the OLE DB Destination with a Row Count transformation also works.

Could this be due to a table-locking issue? Any suggestions?

Thanks
ray

Just to confirm: you haven't set the Table lock check box on the OLE DB Destination?
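
For context, the Table lock option on a fast-load OLE DB Destination adds a TABLOCK hint, so the destination holds a table-level lock for the duration of the data flow, which would block the OLE DB Command's updates against the same table. A rough two-session sketch of that kind of blocking (dbo.DimCustomer is just a placeholder table name):

-- Session 1: simulates the destination holding a table-level lock.
BEGIN TRANSACTION;
INSERT INTO dbo.DimCustomer WITH (TABLOCK) (CustomerKey, CustomerName)
VALUES (1, 'New customer');
-- ...transaction left open while the data flow is still running...

-- Session 2: simulates the OLE DB Command's update.
-- This statement waits until session 1 commits or rolls back.
UPDATE dbo.DimCustomer
SET CustomerName = 'Updated customer'
WHERE CustomerKey = 1;

-- Session 1: releasing the lock unblocks session 2.
COMMIT TRANSACTION;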

|||

You might look at the Slowly Changing Dimension task... I'm not that familiar with it.

A quick and easy solution would be to send one of the outputs, probably the updates, to a Raw File destination. Then read that raw file in a different data flow task and perform the update there.

Doing a lot of updates in an OLE DB Command, row by row, can be time consuming, and I usually try to find another way to do it using some kind of set-based operation. For example, if I can identify unique properties of the source data, such as a date range or a specific field value, I would use that to construct a SQL command that updates all of the records at one time.
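
For example, a sketch of one set-based alternative (table and column names are made up): land the rows that need updating in a staging table with an OLE DB Destination, then run a single UPDATE in an Execute SQL Task instead of the row-by-row OLE DB Command.

-- Update the target from the staged rows in one set-based statement.
UPDATE tgt
SET    tgt.CustomerName = stg.CustomerName,
       tgt.City         = stg.City
FROM   dbo.DimCustomer AS tgt
JOIN   dbo.StagingCustomer AS stg
  ON   stg.CustomerKey = tgt.CustomerKey;

-- Clear the staging table for the next run.
TRUNCATE TABLE dbo.StagingCustomer;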

|||

Here are two links to pages that talk about different techniques for doing an "upsert". This is the first time I have heard that term. I like it!

This is my web site:
http://www.mathgv.com/sql2005docs/SSISTransformScriptETL.htm

This is Jamie Thomson's cool blog:
http://blogs.conchango.com/jamiethomson/archive/2006/09/12/SSIS_3A00_-Checking-if-a-row-exists-and-if-it-does_2C00_-has-it-changed.aspx


Thanks,
Greg Van Mullem

|||

rlee1003 wrote:

Hi,

I have a data flow task that performs an "upsert" by directing successful rows from a Lookup to an OLE DB Command that updates rows, and unsuccessful rows (the Lookup error output) to an OLE DB Destination for insertion.

The problem is that execution hangs when both tasks update/insert into the same table (execution is still hung after 20 minutes). Modifying the OLE DB Destination to insert into a different table succeeds (execution completes within 2 minutes). Replacing the OLE DB Destination with a Row Count transformation also works.

Could this be due to a table-locking issue? Any suggestions?

Thanks
ray

Yes, this is absolutely a locking (actually a blocking) issue. Execute sp_lock or sp_who2 to confirm it.
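
For example, while the package is hung, something like this in Management Studio will show who is blocking whom (sys.dm_exec_requests is available in SQL Server 2005 and later):

-- Quick check: look for a non-zero value in the BlkBy column.
EXEC sp_who2;

-- More detail on blocked sessions and their blockers.
SELECT session_id, blocking_session_id, wait_type, wait_time, command
FROM   sys.dm_exec_requests
WHERE  blocking_session_id <> 0;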

You can get around this problem by executing the insert and update operations in different data-flows. Use a raw file to pass data from one data-flow to another. This technique is covered (albeit for use in a different context) here:

Splitting order detail and order header information from one file into multiple tables
(http://blogs.conchango.com/jamiethomson/archive/2006/05/22/3974.aspx)

-Jamie

|||

Oho, I get the question now. My previous reply was useful but a little off topic. Are you using the "Fast Load" option in your OLE DB Destination? Turning this off might help.

You could also use two "OLE DB Command" components (running insert and update commands). That's what I'm doing, so I know it works quite well.
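
For what it's worth, the SQL in the two components is just a parameterized UPDATE and a parameterized INSERT along these lines (the table and columns are only illustrative; each ? placeholder is mapped to an input column in the OLE DB Command editor):

-- OLE DB Command 1: update the rows that matched in the Lookup.
UPDATE dbo.DimCustomer
SET    CustomerName = ?, City = ?
WHERE  CustomerKey = ?;

-- OLE DB Command 2: insert the rows from the Lookup's no-match/error output.
INSERT INTO dbo.DimCustomer (CustomerKey, CustomerName, City)
VALUES (?, ?, ?);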

Thanks,
Greg
