Do you have to set the Error Output on DF components to "Fail Component" in order to get the errors?
What I would LIKE to do is a combination of "Ignore Failure" and "Fail Component". You see, I am using the Logging feature in my package that creates the sysdtslog90 table in the SQL database. The errors that I am logging make sense and have enough information for my purposes.
The problem is that I would like to continue processing the data and not have it stop when a data error occurs. I REALLY do not want to Redirect Rows unless it is necessary for me to do what I am asking.
Using Ignore Failure on both the source text file and destination SQL table allows the "good" data to be inserted, but I cannot get any info on the columns in error. Conversely, if I choose to Fail Component, I get the info on the columns in error, but only the data processed before the error was encountered makes it into the table.
Suggestions?
You can't "ignore failure" and expect to obtain error information on columns. After all, you're ignoring the error.|||
ronemac wrote:
The problem is that I would like to continue processing the data and not have it stop when a data error occurs. I REALLY do not want to Redirect Rows unless it is necessary for me to do what I am asking.
Redirect Rows is the only way to capture error information about the rows, and still be able to send them to a destination.
|||And that makes complete sense. What I am looking for is an option to fail the component but still insert the "good" data. I don't care that the component fails - just as long as the "good" data makes it. I can then use the errors to let the data creator know that there were issues with some of the data - and give them an idea of what the bad data is.
I am guessing there is no such option since you did not reply with a suggestion for me?
|||That being said....
I am trying to use the Fast Load option, and I noticed that you cannot redirect rows with Fast Load. All of the things I read pointed to using Fast Load to avoid the SQL logging - using Fast Load is similar to Bulk Insert from what I gather. True statement? Other suggestions?
Thanks
|||
ronemac wrote:
That being said....
I am trying to use the Fast Load option, and I noticed that you cannot redirect rows with Fast Load. All of the things I read pointed to using Fast Load to avoid the SQL logging - using Fast Load is similar to Bulk Insert from what I gather. True statement? Other suggestions?
Thanks
True. However, you can redirect rows -- sort of. Using the fast load option and redirect error rows, if an error is encountered, then the whole batch (as determined by MaxInsertCommitSize) is redirected. From this point, you could hook the error flow up to another OLE DB destination, this time not using fast load. Coming out of here, you can capture the error row. This will help to speed up processing, depending on the size of the batch that gets redirected.
Or, you can build in error handling upstream. For instance, if your errors are mostly referential integrity errors, you can use a Lookup upstream to see if the incoming record exists in the destination or not. If it doesn't, send it to the OLE DB Destination. If it does, either update it or throw it away.|||
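To illustrate the upstream Lookup approach just described: if the destination were, say, a Student table keyed on StudentID (hypothetical table and column names - substitute your own), the Lookup transformation's reference query might be a sketch like this:

```sql
-- Reference query for the upstream Lookup transformation.
-- Table and key names are hypothetical; use your destination's actual key.
SELECT StudentID
FROM dbo.Student;
```

Incoming rows whose StudentID matches the reference set follow the Lookup's normal output (update or discard them there); unmatched rows can be redirected (via the Lookup's error output in SSIS 2005) to the OLE DB Destination for insert.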
Okay, just one more question... Is it the Source (text file) or Destination (SQL table) where I want to redirect the rows?
The errors I have run into so far (and probably will in the future) are that some of the values FROM THE SOURCE do not match or are not valid for the SQL data type.
Thanks again.
|||You may want to redirect both. It's up to you on which errors you want to capture. Errors that happen when the record is attempting to be inserted should be caught with an error redirect on the OLE DB destination, for instance.|||In general, I think the earlier you can eliminate or redirect rows, the better. So my first thought would be eliminate it at the source. Why do any more processing of a row that's in error than you absolutely have to?|||I agree. Thanks|||
jwelch wrote:
In general, I think the earlier you can eliminate or redirect rows, the better. So my first thought would be eliminate it at the source. Why do any more processing of a row that's in error than you absolutely have to?
Except that it may not be in error at the source.|||
Well, I thought I was done here... This is a pretty simple question that you may know the answer to.
These are two of the errors I took from the Progress list after the package ran, having chosen Fail Component (instead of Redirect Rows) on the SQL destination DF component.
[Destination - Student [49]] Error: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Invalid character value for cast specification".
[Destination - Student [49]] Error: There was an error with input column "STU_FREE_RDUC_LNCH" (910) on input "Destination Input" (62). The column status returned was: "The value could not be converted because of a potential loss of data.".
So, if I redirect the rows (instead of Fail Component) and throw in a script task that sets some outputs for me to log - including the error description - it works great, and I do log the same error description as shown in the first error above.
Can I not capture that second error from the 2 errors above when I redirect rows? Isn't there something that will tell me WHAT caused the row to fail?
Life would be SO MUCH easier if I could just get that column name when I redirect the rows.
Thanks
|||The reason for this thread has to do with me using the built in SQL logging (to sysdtslog90) and then querying this table to get "meaningful" information.
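For reference, pulling the "meaningful" error rows out of the built-in log table can be a simple query - a sketch assuming the default sysdtslog90 table created by the SQL Server log provider:

```sql
-- Return only the error events logged by the package, oldest first,
-- along with the component (source) that raised each one.
SELECT starttime, source, message
FROM dbo.sysdtslog90
WHERE event = 'OnError'
ORDER BY starttime;
```

Filtering on event = 'OnError' keeps the diagnostic OnPipeline* and OnProgress chatter out of the results.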
I found a sort-of-workaround for this issue.
Basically, what I am doing is setting the Error Output on the source file to Ignore Failure so that the "good" data loads. On the OLE DB Destination, I am setting the Error Output to Redirect Row to a second OLE DB Destination that is mapped to the same table (knowing the rows will fail). I am then setting the Error Output on the second OLE DB Destination to Fail Component. This is where I get the column error message.
Doing this will allow me to capture the number of rows processed in the source, the number written to the destination, and most importantly it will give me the column name that caused the error...
Kind of ugly, but it gives me what I want.