Showing posts with label import. Show all posts

Friday, March 30, 2012

Log parser

Can someone send me the command to import the Application and System
event logs into SQL Server for, say, two different servers?
Does the command have to run every few minutes? How does it handle
duplicates, etc.? I would like to have the logs in the database no later
than, say, 15 minutes from the time they are written to the respective
log. But I don't know how Log Parser would be smart enough to do that
unless I run it every 5 minutes; and even then, what does it scan, and
how does it ensure it does not insert duplicates?
Thanks
You have to write something custom to do this. You can do it with an SSIS
WMI Data Reader task. The class to look at is Win32_NTLogEvent.
Other options include PowerShell called from xp_cmdshell, or a CLR
function that gets to WMI.
Jason Massie
http://statisticsio.com
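For the Log Parser route specifically, a scheduled command along these lines could work. This is only a sketch based on Log Parser 2.2's EVT input and SQL output formats; the server, database, table, and checkpoint file names are placeholders. The -iCheckPoint option is what addresses the duplicates question: Log Parser records how far it read in each log on every run, so a run scheduled every few minutes only picks up new events.

```
REM Sketch: pull the Application and System logs from two servers into
REM one SQL Server table. Run from a scheduled task every 5-15 minutes;
REM the checkpoint file tracks the last record read per log, so re-runs
REM do not re-insert events already imported.
LogParser.exe "SELECT * INTO EventLogs FROM \\SERVER1\Application, \\SERVER1\System, \\SERVER2\Application, \\SERVER2\System" -i:EVT -o:SQL -server:MYSQLHOST -database:EventLogDB -createTable:ON -iCheckPoint:eventlogs.lpc
```

With -createTable:ON the target table is created on first run if it does not exist; to meet a 15-minute freshness target, scheduling the task every 5 minutes leaves headroom for the occasional slow run.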

Wednesday, March 28, 2012

Log Import Problem using Reporting Services

Hi,
I am creating a report using MSS and the Microsoft MSSLogToDatabase
utility. The utility seems to work great until there is an attempt to
specify a subdirectory for the /reportfolder parameter.
I have searched far and wide in diverse newsgroups, Google, Microsoft's
site, etc., but can't seem to get a definitive answer as to whether
there is a way to have the report created on the Reporting Services
website inside another folder.
The documentation is very terse and I can't find any additional
information on using the utility. The description of the
/reportfolder:value switch is "Specifies a name for the target
Reporting Services folder. Optional. If unspecified, Speech Application
Reports is used."
I have tried every method I could think of for designating a
subfolder, such as:
/reportfolder:myReportFolder/myReport
/reportfolder:myReportFolder\myReport
/reportfolder:myReportFolder%2fmyReport
Any attempt to create a subdirectory structure on the Reporting Server
has either failed altogether with this excerpt of the error:
Failed to create folder for reports on the specified SQL Reporting
Services host:
The name of the item 'myReportFolder/myReport' is not valid. The name
must be less than 260 characters long, must not start with slash; other
restrictions apply. Check the documentation for complete set of
restrictions.
Or it creates a single folder on the home page with a name like
myReportFolder%2fmyReport.
Unfortunately, I suspect the error is too vague and generic to indicate
the actual cause of the failure, since complying with the character
restrictions still doesn't produce the subdirectory structure.
I have even explored the data structure and layout of the web content
in the catalogs table of the Reports database, hoping to find a clue,
but to no avail.
In a nutshell, there is one of two answers to this problem:
1. The utility is limited in this capacity, and there isn't any method
of resolving or working around the problem to get the desired results.
2. The utility is capable of performing the requested action, but the
documentation is very poor when it comes to more elaborate descriptions
of its features.
If it is the latter of the two and someone can provide a viable
workaround or answer, that would be great!
Thanks,
Mark

Wish that I could help... I've got some reports with paths around
~100 characters, and they are coughing up this same error message,
but it's not specific to the MSSLogToDatabase utility.
I had heard it's a bug with a bad error description; I wish I could
find the resolution.
Does anyone know which characters we're allowed to use in a report
title?
-Aaron
Hi Aaron,
I may be able to provide some assistance concerning your inquiry.
I have found that the following set of rules apply to virtually any
path designated within MSS because it is based on a web-based platform:
A. The full name of the item (including the path) cannot exceed 260
characters.
B. Names cannot be null or consist of an empty string.
C. Names cannot include any of these reserved characters (for URL
paths): : ? ; @ ' = + $ , \ * > < | . "
D. You cannot end an item name with a backslash character.
Although my issue was a bit different, the same set of rules still
applied, because I was attempting to specify a path to the report
using \.
When I attempted to use /, I believe the error was thrown simply because
the tool is limited to creating a directory structure in the root only;
in addition, the / was probably reserved as a switch delimiter on the
command line from which the utility was run, since / isn't an invalid
URL character.
As far as naming your reports goes, there aren't any restrictions that
I am aware of that would prohibit you from naming your reports whatever
you desire. I proved this by enclosing the report name in double quotes,
which explicitly designates every character within the quotes as a
literal string, e.g. "Reports/myReports" and "Reports\myReports".
The problem was that, by doing so, the intended path was interpreted as
a literal string and not as a path. I was then left with a report
called Reports/myReports.
I couldn't resolve the problem, since it was apparently not a bug but a
mere limitation of the utility. I did confirm that, short of this
issue, the operation didn't totally crash but continued gracefully with
the remaining tasks.
I hope I have helped clear up some of your questions. If not, e-mail me
and I'll assist in any way I can.
Thanks!
Mark
lol, it was actually a SPACE:
filename_.rdl, with the underscore representing a SPACE.
I don't know how I missed it; it seems like it should have fixed itself
automatically, but I found it and fixed a half dozen reports with
minimal effort.
-Aaron

Log Import Error

I have a data warehouse that imports all of the required profile,
transactions, logs, etc. on a daily basis and has been running fine. The
other night, I started to receive the following errors during the web log
import.
My package logging shows:
Step 'DTSStep_CS_DTSLogImps.DTSLogImport_1' failed
Step Error Source: Microsoft Data Transformation Services (DTS) Package
Step Error Description:The task reported failure on execution.
Step Error code: 8004043B
Step Error Help File:sqldts80.hlp
Step Error Help Context ID:700
Step Execution Started: 9/19/2005 6:38:35 PM
Step Execution Completed: 9/19/2005 6:38:44 PM
Total Step Execution Time: 8.532 seconds
Progress count in Step: 0
The SQL job logging shows:
DTSRun OnError: DTSStep_CS_DTSLogImps.DTSLogImport_1, Error = -2147220421
(8004043B)
Error string: The task reported failure on execution.
Error source: Microsoft Data Transformation Services (DTS) Package
Help file: sqldts80.hlp
Help context: 700
Error Detail Records:
Error: -2147220421 (8004043B); Provider Error: 0 (0)
Error string: The task reported failure on execution.
Error source: Microsoft Data Transformation Services (DTS) Package
Help file: sqldts80.hlp
Help context: 700
The application event viewer has the following:
Event Type: Error
Event Source: CSDataWareHouse
Event Category: None
Event ID: 49176
Date: 9/19/2005
Time: 8:14:49 PM
User: N/A
Computer: INCDCPRDWSQL02
Description:
Log Import Task :Failed to import logs for site : StaplesCV
Event Type: Error
Event Source: Commerce Server 2002
Event Category: None
Event ID: 33123
Date: 9/19/2005
Time: 8:14:49 PM
User: N/A
Computer: INCDCPRDWSQL02
Description:
Log Import Task did not import any hits
Event Type: Error
Event Source: Commerce Server 2002
Event Category: None
Event ID: 33122
Date: 9/19/2005
Time: 8:14:49 PM
User: N/A
Computer: INCDCPRDWSQL02
Description:
Error launching parser. status code : -2147467259 (-2147467259)
I have searched but cannot find anything related to an import process
that was functioning previously. Has anyone run into this problem
before? Thanks
- Rich

Hello,
I replied to you in the microsoft.public.sqlserver.dts newsgroup. You
may want to go there for follow-up.
Best Regards,
Peter Yang
MCSE2000/2003, MCSA, MCDBA
Microsoft Online Partner Support
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.


Wednesday, March 21, 2012

Log file is increasing too much

Hi
I have a database and a process which deletes everything from the
database and then imports some stuff from other databases.
How can I keep the log file from growing so much? Sometimes the import
stops because of it.
I use the following procedures:
backup log mydatabase with truncate_only
dbcc shrinkdatabase (mydatabase)
I was wondering if there is any way to put a maximum size on the log
file, and when it reaches that maximum to overwrite the existing log
records rather than raise the error "The log file for database
'mydatabase' is full. Back up the transaction log for the database to
free up some log space."
Thanks in advance

If you are using the FULL recovery model, you would need to back up the log
before it can be reused. Otherwise, you might consider using the simple
recovery model.
Another possibility is that your import process is done within one
transaction, hence all the data is either committed or rolled back. If
this is the case (and you are using the simple recovery model), you
might consider breaking the import into smaller transactions.
If you are deleting everything initially, you might also consider using
TRUNCATE TABLE, as the row deletions are then not individually logged.
--
Peter Yeoh
http://www.yohz.com
Need smaller SQL2K backup files? Use MiniSQLBackup Lite, free!
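To make the advice above concrete, here is a rough T-SQL sketch; the names (mydatabase, mytable, mydatabase_log) are placeholders for your own schema. It switches the database to the simple recovery model, clears the table with TRUNCATE TABLE instead of DELETE, and caps the log file's size. Note that capping the log does not make SQL Server overwrite old log records the way the original poster hoped; it only bounds the file, and a transaction that outgrows the cap still fails with the "log is full" error.

```sql
-- Sketch only: all object names below are placeholders.

-- 1. Simple recovery: the log is truncated automatically at checkpoints,
--    so no log backups are needed (point-in-time restore is lost).
ALTER DATABASE mydatabase SET RECOVERY SIMPLE;

-- 2. Clear the table with minimal logging instead of a row-by-row DELETE.
USE mydatabase;
TRUNCATE TABLE mytable;

-- 3. Cap the log file's growth. Size the cap for the largest import
--    batch, since hitting it mid-transaction still raises an error.
ALTER DATABASE mydatabase
MODIFY FILE (NAME = mydatabase_log, MAXSIZE = 2GB);
```

If the import must stay in one big transaction, the log cap has to accommodate that whole transaction; otherwise, breaking the import into smaller committed batches keeps the reusable portion of the log small.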
Thanks, I think this was the problem. I was using the DELETE command
instead of TRUNCATE. I switched the recovery model to simple and turned
the auto shrink option on.
Thanks again.
You might want to check out my article regarding shrink:
http://www.karaszi.com/SQLServer/info_dont_shrink.asp
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"Psycho" <Psycho@.discussions.microsoft.com> wrote in message
news:AC3F1D5E-CA73-47D8-8EA0-D3FBF3315DAA@.microsoft.com...
> Thanks
> i think this was the problem. I use delete command instead of truncate. I
> turned the recovery mode to simple and I put the option auto shrink on.
> Thanks again.
> "Peter Yeoh" wrote:
>> If you are using the FULL recovery model, you would need to backup the log
>> before it can be reused. Otherwise, you might consider using the simple
>> recovery model.
>> Another possibility is that your import process is done within one
>> transaction, hence all the data is either committed or rolled back. If this
>> was the case (and you are using the simple recovery model), you might
>> consider breaking the import into smaller transactions.
>> If you are deleting everything initially, you might also consider using the
>> TRUNCATE TABLE option as the row deletion is not individually logged.
>> --
>> Peter Yeoh
>> http://www.yohz.com
>> Need smaller SQL2K backup files? Use MiniSQLBackup Lite, free!
>>
>> "Psycho" <Psycho@.discussions.microsoft.com> wrote in message
>> news:26E3911D-CEA7-4389-8E26-1E76B3EEDF11@.microsoft.com...
>> > Hi
>> > i have a database and a proccess which delete everything from database and
>> > then import some stuff from other databases.
>> > how can i reduce to increase log file because sometimes import stops
>> because
>> > of it.
>> > I use the following procedures :
>> >
>> > backup log mydatabase with truncate_only
>> > dbcc shrinkdatabase (mydatabase)
>> >
>> > i was wondering if it's any possible way to put a maximum size for log
>> file
>> > and when it get to the maximum file size to override the existent logs and
>> > not raise me the error that "The log file for database 'mydatabase' is
>> full.
>> > Back up the transaction log for the database to free up some log space."
>> >
>> > thanks in advance
>>|||Also Truncate table does NOT work on tables with FKs, so you'll have to
remove those first..
Another alternative is to delete a range of rows ie
set rowcount 10000
delete from mytable
while @.@.rowcount !=0
delete from mytable
set rowcount 0
Wayne Snyder, MCDBA, SQL Server MVP
Mariner, Charlotte, NC
www.mariner-usa.com
(Please respond only to the newsgroups.)
I support the Professional Association of SQL Server (PASS) and it's
community of SQL Server professionals.
www.sqlpass.org
"Psycho" <Psycho@.discussions.microsoft.com> wrote in message
news:26E3911D-CEA7-4389-8E26-1E76B3EEDF11@.microsoft.com...
> Hi
> i have a database and a proccess which delete everything from database and
> then import some stuff from other databases.
> how can i reduce to increase log file because sometimes import stops
because
> of it.
> I use the following procedures :
> backup log mydatabase with truncate_only
> dbcc shrinkdatabase (mydatabase)
> i was wondering if it's any possible way to put a maximum size for log
file
> and when it get to the maximum file size to override the existent logs and
> not raise me the error that "The log file for database 'mydatabase' is
full.
> Back up the transaction log for the database to free up some log space."
> thanks in advance

Monday, March 19, 2012

Log file is increasing too much

Hi
I have a database and a process which deletes everything from the database and
then imports some stuff from other databases.
How can I keep the log file from growing so much? Sometimes the import stops
because of it.
I use the following procedures:
backup log mydatabase with truncate_only
dbcc shrinkdatabase (mydatabase)
I was wondering if there is any way to put a maximum size on the log file,
so that when it gets to the maximum size it overwrites the existing log records
instead of raising the error "The log file for database 'mydatabase' is full.
Back up the transaction log for the database to free up some log space."
Thanks in advance
If you are using the FULL recovery model, you would need to back up the log
before it can be reused. Otherwise, you might consider using the simple
recovery model.
Another possibility is that your import process is done within one
transaction, hence all the data is either committed or rolled back. If this
was the case (and you are using the simple recovery model), you might
consider breaking the import into smaller transactions.
If you are deleting everything initially, you might also consider using the
TRUNCATE TABLE option as the row deletion is not individually logged.
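Peter's two suggestions can be sketched in a few lines of T-SQL. This is only a sketch, assuming a database named mydatabase and a staging table named mytable (the table name is hypothetical; the original post does not give one):

```sql
-- Switch to the simple recovery model so committed log space is
-- reused automatically at checkpoints (no log backups required,
-- but point-in-time restore is lost).
ALTER DATABASE mydatabase SET RECOVERY SIMPLE;

-- Clear the staging table with a minimally logged TRUNCATE instead
-- of a fully logged DELETE before re-importing.
TRUNCATE TABLE mytable;
```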
Peter Yeoh
http://www.yohz.com
Need smaller SQL2K backup files? Use MiniSQLBackup Lite, free!
"Psycho" <Psycho@.discussions.microsoft.com> wrote in message
news:26E3911D-CEA7-4389-8E26-1E76B3EEDF11@.microsoft.com...
> Hi
> i have a database and a proccess which delete everything from database and
> then import some stuff from other databases.
> how can i reduce to increase log file because sometimes import stops
because
> of it.
> I use the following procedures :
> backup log mydatabase with truncate_only
> dbcc shrinkdatabase (mydatabase)
> i was wondering if it's any possible way to put a maximum size for log
file
> and when it get to the maximum file size to override the existent logs and
> not raise me the error that "The log file for database 'mydatabase' is
full.
> Back up the transaction log for the database to free up some log space."
> thanks in advance|||Thanks
I think this was the problem. I was using the delete command instead of truncate. I
switched the recovery model to simple and turned the auto shrink option on.
Thanks again.
"Peter Yeoh" wrote:

> If you are using the FULL recovery model, you would need to backup the log
> before it can be reused. Otherwise, you might consider using the simple
> recovery model.
> Another possibility is that your import process is done within one
> transaction, hence all the data is either committed or rolled back. If this
> was the case (and you are using the simple recovery model), you might
> consider breaking the import into smaller transactions.
> If you are deleting everything initially, you might also consider using the
> TRUNCATE TABLE option as the row deletion is not individually logged.
> --
> Peter Yeoh
> http://www.yohz.com
> Need smaller SQL2K backup files? Use MiniSQLBackup Lite, free!
>
> "Psycho" <Psycho@.discussions.microsoft.com> wrote in message
> news:26E3911D-CEA7-4389-8E26-1E76B3EEDF11@.microsoft.com...
> because
> file
> full.
>
>|||You might want to check out my article regarding shrink:
http://www.karaszi.com/SQLServer/info_dont_shrink.asp
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"Psycho" <Psycho@.discussions.microsoft.com> wrote in message
news:AC3F1D5E-CA73-47D8-8EA0-D3FBF3315DAA@.microsoft.com...
> Thanks
> i think this was the problem. I use delete command instead of truncate. I
> turned the recovery mode to simple and I put the option auto shrink on.
> Thanks again.
> "Peter Yeoh" wrote:
>|||Also, TRUNCATE TABLE does NOT work on tables referenced by foreign keys, so you'll have to
remove those first.
Another alternative is to delete the rows in batches, i.e.
set rowcount 10000
delete from mytable
while @@rowcount != 0
delete from mytable
set rowcount 0
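A note for later versions: using SET ROWCOUNT to limit data-modification statements is deprecated from SQL Server 2005 onward, and the same batching idea can be written with DELETE TOP. A sketch, again against the hypothetical mytable:

```sql
-- Delete in batches of 10,000 rows; each batch commits separately,
-- so under the simple recovery model the log space can be reused
-- between batches instead of growing for one huge delete.
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM mytable;
    IF @@ROWCOUNT = 0 BREAK;
END
```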
Wayne Snyder, MCDBA, SQL Server MVP
Mariner, Charlotte, NC
www.mariner-usa.com
(Please respond only to the newsgroups.)
I support the Professional Association of SQL Server (PASS) and its
community of SQL Server professionals.
www.sqlpass.org
"Psycho" <Psycho@.discussions.microsoft.com> wrote in message
news:26E3911D-CEA7-4389-8E26-1E76B3EEDF11@.microsoft.com...
> Hi
> i have a database and a proccess which delete everything from database and
> then import some stuff from other databases.
> how can i reduce to increase log file because sometimes import stops
because
> of it.
> I use the following procedures :
> backup log mydatabase with truncate_only
> dbcc shrinkdatabase (mydatabase)
> i was wondering if it's any possible way to put a maximum size for log
file
> and when it get to the maximum file size to override the existent logs and
> not raise me the error that "The log file for database 'mydatabase' is
full.
> Back up the transaction log for the database to free up some log space."
> thanks in advance


Friday, March 9, 2012

Log Errors but Continue to Load Data?

Hello,

In my SSIS package I am trying to import a .csv flat file with about 170,000 rows of data with 75 columns (I know this is rather large). I would like to create an SSIS package that loads ALL of the rows of data into a table. This table has its fields defined (such as money, float, varchar, datetime, etc.). I want the data to load to this table even if there is a conversion error on any field. When the data is loaded to this table, any fields that couldn't be converted should be set to null. Of the 75 columns of data, there are MANY columns that could be invalid.

For example, if the flat file contains the value "00/00/00" for my "PaidDate" field, I would want all of the other fields to be populated and the value for this particular record's "PaidDate" field to be null.

It would also be nice if I could trap any of the fields that caused a problem and log them along with the row.

I know that SSIS supports the "Redirect Row" method for handling data, but in my case, I don't want to redirect it. I want to continue to load it, just with a null value.

Is there an easy way of doing this (i.e. perhaps by using the Advanced Editor for the Flat File Source and setting something in the Input/Output columns)? I really don't want to create a Derived Column or Data Conversion transformation for every field that needs to be converted (b/c there are 75 columns to maintain this for).

I know if I import this file to an Access 2003 database, it imports the data fine and logs any conversion errors to an "_ConversionErrors" table. I am basically looking for this same behavior in SSIS, without having to maintain 75 columns.

TIA

There is one other way I can think of. If you change the output columns of your source to be the same data type as your destination and select "ignore" for all errors, then it will just cause that one column of the row not to be loaded into the data flow. Of course, you have no way of determining which rows these are, and you also have to manage the metadata by hand, so I doubt you'd be much better off than with a data conversion transformation.

Thanks Brent!

I actually started going down this path (converting from the flat file source) right after I posted my initial question. It stinks that I can't identify the rows. I tried to make the conversion/truncation for each "redirect" go to another route and then join them all back together using a UNION ALL, but it turns out you only get three values back from the error output columns: a stream reader, an error code, and an error column. This isn't enough information to get back to the row that caused the error without some nasty parsing logic, I assume.

Would be nice if there was a way to funnel that stream and re-create a row from it!

Thanks again!
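Another route to the Access-style behavior described above, if a two-step load is acceptable: land every column in a staging table as varchar (which cannot fail on conversion), then insert into the typed table, NULLing values that do not parse. A sketch with hypothetical table and column names; note that ISDATE/ISNUMERIC have well-known edge cases (ISNUMERIC accepts some strings money cannot convert), and on SQL Server 2012 or later TRY_CONVERT is the cleaner tool:

```sql
-- Second step of a two-step load: every field in staging.Claims_Raw
-- is varchar; values that fail validation become NULL in the typed
-- table instead of failing the whole load.
INSERT INTO dbo.Claims (PaidDate, PaidAmount)
SELECT
    CASE WHEN ISDATE(PaidDate) = 1
         THEN CONVERT(datetime, PaidDate) END,  -- e.g. '00/00/00' becomes NULL
    CASE WHEN ISNUMERIC(PaidAmount) = 1
         THEN CONVERT(money, PaidAmount) END
FROM staging.Claims_Raw;
```

The rows that lost a value can then be logged afterwards, e.g. with SELECT * FROM staging.Claims_Raw WHERE ISDATE(PaidDate) = 0, which recovers the per-row error detail that the flat file source's error output does not expose.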