Tuesday, March 20, 2012

Post processing data flow destinations

What I am trying to do is move data from a staging table into a live environment and then update the staging table AFTER the row has moved (and not errored). There does not seem to be a reliable method for doing this.

Any assistance would be appreciated.

Brett Robson wrote:

What I am trying to do is move data from a staging table into a live environment and then update the staging table AFTER the row has moved (and not errored). There does not seem to be a reliable method for doing this.

Any assistance would be appreciated.

What specifically isn't reliable? It sounds to me like you need a data-flow to move the data and an Execute SQL Task to issue the UPDATE. That should work OK.

-Jamie

|||

Jamie,

I don't think you're quite following what I am trying to do. I want to update the SOURCE table after the rows have moved. I don't want to update the ones which error. How does the Execute SQL Task get the appropriate row IDs? They certainly won't be passed from the data-flow.

Regards,

BRETT ROBSON

|||

Brett Robson wrote:

Jamie,

I don't think you're quite following what I am trying to do. I want to update the SOURCE table after the rows have moved. I don't want to update the ones which error. How does the Execute SQL Task get the appropriate row IDs? They certainly won't be passed from the data-flow.

Regards,

BRETT ROBSON

Oh OK. I gotcha.

You're right, Execute SQL Task is not an option. You can execute your UPDATE from the data-flow though using an OLE DB Command component.

This will work fine - the only problem with it is that it issues an UPDATE for every row, which isn't very performant. Another option is to push all the successful rows into another table and then update the source table from this new table. That update can be issued from an Execute SQL Task and will be much more performant because you're only doing it once, as a set-based operation.
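A rough T-SQL sketch of that second approach - table and column names here (Staging, SucceededRows, StagingID, Processed) are illustrative, not from the thread; the data-flow's success path would land rows in SucceededRows, and the Execute SQL Task would then run something like:

```sql
-- Set-based update issued once from an Execute SQL Task, after the
-- data-flow has landed all successfully-moved rows in dbo.SucceededRows.
-- All object names are hypothetical examples.
UPDATE s
SET    s.Processed = 1
FROM   dbo.Staging AS s
JOIN   dbo.SucceededRows AS ok
       ON ok.StagingID = s.StagingID;

-- Optionally clear the landing table ready for the next run.
TRUNCATE TABLE dbo.SucceededRows;
```

One UPDATE joined on the staging key touches only the rows that actually made it across, which is what avoids marking the errored rows.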

-Jamie

|||

Thanks Jamie - either of those two should work.
