
Tuesday, March 20, 2012

[Transfer SQL Server Objects Task] Error: Table "XXXXXXX" does not exist at the source

Does anyone know what could be causing this error with the Transfer SQL Server Objects Task? I tried to develop an SSIS project in Business Intelligence Development Studio to transfer tables between databases on the same server. However, I have been getting the following error:

[Transfer SQL Server Objects Task] Error: Table "XXXXXX" does not exist at the source.

Is there a setting that I need to change to make this work? Thank you for your help.

Is the table you've specified to move on the task in the source database?|||

Yes. It is in there. I can see it in the selection list after I select the database. Thanks.

And the Transfer SQL Server Objects Task is the only task in that SSIS package.

|||I think your problem is related to the schema of the table you are trying to copy.

I created table "testtable" in testdb1. The schema was "dbo". I could transfer this table to another database named testdb2 on the same server using the Transfer SQL Server Objects Task.

Then, I changed the schema of "testtable" to "guest". When I executed the same task, I got the error message "[Transfer SQL Server Objects Task] Error: Table "testtable" does not exist at the source. "|||
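
If you hit the same thing, a workaround until the task is fixed is to move the table back into the dbo schema before the transfer runs (and move it back afterwards if needed). A rough sketch of doing that from code, using the testdb1/guest.testtable names from the example above - this is an illustration, not part of the original task configuration:

using System.Data.SqlClient;

class SchemaWorkaround
{
    static void Main()
    {
        using (SqlConnection con = new SqlConnection(
            "server=.;database=testdb1;Trusted_Connection=yes"))
        {
            con.Open();
            // Move the table from the guest schema back to dbo so the
            // Transfer SQL Server Objects Task can find it at the source.
            using (SqlCommand cmd = new SqlCommand(
                "ALTER SCHEMA dbo TRANSFER guest.testtable;", con))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}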

Thanks for your response. Where did you create the "Transfer SQL Server Objects Task"? Is it under Control Flow or Data Flow? When I had the error, the only task I had was the "Transfer SQL Server Objects Task" under Control Flow.

Thanks.

|||Transfer SQL Server Objects is a Control Flow task. I had my task in the Control Flow tab, and the task I explained above was the only task in the package.

|||I had the exact same issue when I transferred data (both tables and XML schemas) from my dev box to the production server. It seemed to me that this particular control flow task recognizes only the dbo schema. Is this bug fixed in SP1?

|||No. We are looking to fix this issue in one of the future releases.

|||I am trying to figure out exactly what the issue is here -- can we not copy tables from one database to another at all at this time? I had both tables in the dbo schema, and I had a different error, this time about the destination not being available. I set the schema owner of the tables to be the same as the authenticated user, and then I received the error that it does not exist at the source.

At this time (10/2006), is there any way to copy tables from one server to another via SSIS? I would think a task this basic would be the first thing to work in the newer DTS...?

Thanks in advance

-Chris Rasmussen|||

I am hoping someone will answer the last question. I cannot believe you would release a product with such a simple task not working. As a workaround, I suppose I can create an SMO task to handle this, but that is more development time than I had expected to spend. Please fix this in your NEXT release, not an undetermined FUTURE release. This kind of missing functionality is hard to point at when trying to convince anyone in a decision-making position to migrate to 2005. Come on, guys and gals.

Your frustrated supporter...
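
As an aside, the SMO workaround mentioned above is fairly short to write by hand. A hedged sketch using the SMO Transfer class - server, database, and table names are placeholders, and the exact set of options you need may differ:

using System.Collections;
using Microsoft.SqlServer.Management.Smo;

class SmoTransferSketch
{
    static void Main()
    {
        Server server = new Server("localhost");
        Database source = server.Databases["testdb1"];

        Transfer transfer = new Transfer(source);
        transfer.CopyAllObjects = false;
        transfer.CopySchema = true;
        transfer.CopyData = true;

        // Transfer only the one table, including a non-dbo schema.
        ArrayList objects = new ArrayList();
        objects.Add(source.Tables["testtable", "guest"]);
        transfer.ObjectList = objects;

        transfer.DestinationServer = "localhost";
        transfer.DestinationDatabase = "testdb2";
        transfer.DestinationLoginSecure = true;   // Windows authentication

        // Copies the table definition and its rows into testdb2.
        transfer.TransferData();
    }
}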

|||

Hi Cliff!

I'd like to share my frustration with the Transfer SQL Server Objects Task. I am trying to do something very simple - transfer 3 tables with primary/foreign keys, referential integrity, and some data - and I have spent hours of fruitless tweaking of the task. If someone could offer a very simple step-by-step process for configuring the "Transfer SQL Server Objects Task" to do that, I would be very grateful.

|||Are you planning on fixing this in SP2?

|||Hi,

I'd like to add my vote for this feature request. Can someone from the dev team tell us whether a decision to fix this has already been made?

Any insight will be most welcome, and will help us pick the right solution.

Thibaut Barrère|||

Hi,

I am also having problems with the "Transfer SQL Server Objects" task. It simply does not work.

The editor interface is sweet, however, and looks very promising. My issue is that I cannot debug this task. I cannot see the code that is generated by the task, and logging does not reveal what I want: the SQL code or whatever code is generated by the task. I want to see the actual table name syntax contained in the resultant command sent over, with the assumption that there is something wrong there.

The documentation is very clear, promising effective results. The reality is different, however. This is simply a management issue. The product was released too early, and the testing regimen must have been inadequate or mismatched with the document.

I have the Evaluation Version of Enterprise. The SMO connectors all test properly, and there is no other problem. The wizard works fine from SSMS, and when I save the output from the wizard to a package, the package does not use the "Transfer SQL Server Objects" task.

So, guys, what we have here is a defect.

This is very annoying, and I will write a letter to Bill Gates requesting immediate attention to this issue.

|||The fix is not released yet.

Sunday, March 11, 2012

[RFI] Sql Server 2k- Error Handling strategy

I am currently trying to put together a document that would outline the
Error handling strategy for a SQL Server 2k based project.
The solution is based on some DTS (mainly ActiveX scripts) but mostly
Stored procedures are doing the bulk of the work.
As I am new to this technology, I thought I would ask someone on the
list who has experience in this area.
Therefore, I am looking for some material/tips/documentation such as
best practices, do's & don'ts, issues to consider, etc.,
or anything that you might think would be useful.
Uzy
http://www.sommarskog.se/error-handling-II.html
"Uzy" <usmanlatif77@.gmail.com> wrote in message
news:1131619531.055286.192370@.z14g2000cwz.googlegroups.com...
>I am currently trying to put together a document that would outline the
> Error handling strategy for a SQL Server 2k based project.
> The solution is based on some DTS (mainly ActiveX scripts) but mostly
> Stored procedures are doing the bulk of the work.
> As I am new to this technology so I thought I would ask someone on the
> list that has experience in this area.
> Therefore, I am looking for some material/tips/documentation such as
> best Practices, Do's & Don't's, issues to consider etc.
> or anything that you might think would be useful.
>

|||Thanks Uri
Uzy wrote:
> I am currently trying to put together a document that would outline the
> Error handling strategy for a SQL Server 2k based project.
> The solution is based on some DTS (mainly ActiveX scripts) but mostly
> Stored procedures are doing the bulk of the work.
> As I am new to this technology so I thought I would ask someone on the
> list that has experience in this area.
> Therefore, I am looking for some material/tips/documentation such as
> best Practices, Do's & Don't's, issues to consider etc.
> or anything that you might think would be useful.
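
Since the thread never went further than the link, here is a minimal sketch of the client-side half of such a strategy - wrapping the stored procedure call in a transaction and surfacing errors raised with RAISERROR. The procedure name and connection string are made up for illustration:

using System;
using System.Data;
using System.Data.SqlClient;

class ErrorHandlingSketch
{
    static void Main()
    {
        using (SqlConnection con = new SqlConnection(
            "server=.;database=MyDb;Trusted_Connection=yes"))
        {
            con.Open();
            SqlTransaction tran = con.BeginTransaction();
            try
            {
                SqlCommand cmd = new SqlCommand("dbo.usp_LoadStaging", con, tran);
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.ExecuteNonQuery();
                tran.Commit();
            }
            catch (SqlException ex)
            {
                // Any RAISERROR from the procedure (or runtime error) lands here;
                // roll back so the run leaves no partial work behind.
                tran.Rollback();
                Console.WriteLine("Load failed: " + ex.Message);
            }
        }
    }
}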

Friday, February 24, 2012

[HELP!] How do you make the database path relative in the connection string?

Hi everyone,

I am working on a Hospital Information System project with a team of 6.Each one of us builds or modifies a part of the system and shares the project over a common repository. The problem here is that we use the absolute path of the database in our connection string. Although the directory structure of our project is the same for each member, since the main project folder itself is stored in different locations on each person's computer (for example, one may have it stored in c:\My Documents and the other in d:\ My Documents), we are forced to modify the connection string manually each time someone else from the team updates the repository with the database. How do we make the path of the database relative so that we don't have to modify the connection string manually each time after receiving an update from the repository?

Your help is greatly appreciated.

If you're referencing the path, then I wonder what database you are using. E.g. for MSSQL you would refer to a server and the database on that server, generally just '.' for a local server.

Lots of ways you could do this, e.g. I've just been playing with a ClickOnce install and this is one I use:

connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\dbRaptor_Data.mdf;Integrated Security=True;User Instance=True"

Thursday, February 16, 2012

[C#] Failing to use SCOPE_IDENTITY()

Dear Reader,

Currently I am building an application for a theme park where I work as a trainee for school; one project for me is to rebuild all the hundreds of databases into a few SQL-driven applications. Now I have a problem with the use of SCOPE_IDENTITY(). Because the data has to be correct before inserting it into the database, I use the transaction features of .NET, and I create one SQL string which I dump into that method. The problem is that I can't seem to use the value of SCOPE_IDENTITY() for some reason; maybe you guys can see a mistake in the actual (dynamic) query:
Here is the query built up by my program to write the data (of a single form) into the database:


DECLARE @OID int;
INSERT INTO Medisch (med_za_ID, med_WeekNr, med_Enen, med_Bijzonderheden, med_AfwijkendGedrag, med_SexGedrag, med_GemAdemhaling, med_GemHoesten, med_Temperatuur, med_Conditie, med_BloedGeprikt, med_Cupje, med_BasisVis, med_Eetlust, med_GemGewicht)VALUES(3,1123,,'','','',,,,'','False','False',45,'',);
SELECT @OID = SCOPE_IDENTITY();
INSERT INTO Medisch_Medicijnen_Details (mmd_mmt_ID, mmd_med_ID, mmd_Hoeveelheid, mmd_Aantal) VALUES(@OID, 2,23, 23 );

Everything else works except for the SCOPE_IDENTITY() part.
I hope someone can help me fix this mistake.
Thanks in advance,

Grz.
Stefan

|||EXACTLY what is your problem? Does Medisch have an IDENTITY column? Why are you not passing any values for some rows (I am not sure that syntax would work)? Why not just leave the column names out of the column list?

|||The problem is that SCOPE_IDENTITY() doesn't seem to return any values.
They don't all have values, because I did not fill in all the data in the form, so it will insert a null value. The string is OK; only the part returning the identity fails. (Medisch has an auto-increment column called med_ID.)

|||To make it clear, you are saying that @OID is null?

|||No, it does not seem to get any value :P. The following error occurs:
Incorrect syntax near ',', which occurs in the part

VALUES(@.OID, "+medI

So for some reason it does not want to put the value into the SQL variable @OID.

For clarity, it passes the string from my first post into the following method:


public bool SqlNieuweInvoer(string sqlDatabase, string sqlQuery)
{
    SqlConnection conData = new SqlConnection("server=" + strServerNaam + ";" + "database=" + sqlDatabase + ";Trusted_Connection=yes");
    SqlCommand comData = new SqlCommand(sqlQuery, conData);
    conData.Open();

    SqlTransaction TranData = conData.BeginTransaction();
    comData.Transaction = TranData;

    try
    {
        comData.ExecuteNonQuery();
        TranData.Commit();
        return true;
    }
    catch (Exception e)
    {
        TranData.Rollback();
        MessageBox.Show(Convert.ToString(e));
        return false;
    }
    finally
    {
        conData.Close();
    }
}

|||Is this an error when you compile? If so, then it has NOTHING to do with the SQL syntax; it is an error in how you are building the string in C# code.

|||This error occurs when running the application.
The user fills in the form, and if he presses the "OK" button of that form the query is generated (for example, it has to individually add the different kinds of medicine related to the medical week report - that's why I need the SCOPE_IDENTITY() function to give me the internally auto-generated increment value of that medical report).

|||This is the code that builds the query, by the way:


private string BouwQuery()
{
    cstrInvoerQuery = cstrInvoerQuery + "DECLARE @OID int;";
    cstrInvoerQuery = cstrInvoerQuery + "INSERT INTO Medisch (med_za_ID, med_WeekNr, med_Enen, med_Bijzonderheden, med_AfwijkendGedrag, med_SexGedrag, med_GemAdemhaling, med_GemHoesten, med_Temperatuur, med_Conditie, med_BloedGeprikt, med_Cupje, med_BasisVis, med_Eetlust, med_GemGewicht)" +
        "VALUES(" + this.DierID + "," + tbWeekNr.Text + "," + tbEnen.Text + ",'" + tbBijzonderheden.Text + "','" + tbAfwijkendGedrag.Text + "','" + tbSexGedrag.Text + "'," + tbAdemfrequentie.Text + "," + tbHoesten.Text + "," + tbTemperatuur.Text + ",'" + gcbConditie.Text + "','" + cbBloedgeprikt.Checked + "','" + cbCupje.Checked + "'," + tbBasisVis.Text + ",'" + tbEetlust.Text + "'," + tbGewicht.Text + ");";
    cstrInvoerQuery = cstrInvoerQuery + " SELECT @OID = SCOPE_IDENTITY();";

    string[] test = new string[] { };
    int medID;
    string strDelimiter = "\t[";
    char[] delimiter = strDelimiter.ToCharArray();
    foreach (string Item in lbMedicijnenSupplementen.Items)
    {
        if (!Item.StartsWith("Naam:"))
        {
            test = Item.Split(delimiter, 6);
            medID = VerkrijgMedicijnSupplementID(test[0]);
            if (medID != 0)
            {
                cstrInvoerQuery = cstrInvoerQuery + "INSERT INTO Medisch_Medicijnen_Details (mmd_mmt_ID, mmd_med_ID, mmd_Hoeveelheid, mmd_Aantal) VALUES(@OID, " + medID + "," + test[1] + ", " + test[2] + ");";
            }
        }
    }

    if (cbBloedgeprikt.Checked)
    {
        cstrInvoerQuery = cstrInvoerQuery + "UPDATE OmgevingsWaardes SET ow_LaatstBloedGeprikt='" + tbWeekNr.Text + "' WHERE ow_za_ID = " + this.DierID + ";";
    }

    tbBijzonderheden.Text = cstrInvoerQuery;
    return cstrInvoerQuery;
}

Maybe this helps you find me an answer :).

|||Your problem is this. You are calling ExecuteNonQuery, which DOES NOT expect a result set to be returned. However, you ARE returning a result set (that is what SELECT @OID... does).

Use ExecuteReader and get back a DataReader, or better, use ExecuteScalar(); the return value from that method can be cast to an integer.

|||That might be helpful. I thought the whole query was being executed on the SQL server, but the statements are actually all executed at once? That explains a lot; thanks, I will go and try the other two things :).

|||Whoohoo!!! I used the scalar one and it finally works, thank you very very much :).
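
For anyone landing here with the same problem, a minimal sketch of the ExecuteScalar approach suggested above - the table, column, and connection string are simplified stand-ins, not the poster's real schema:

using System;
using System.Data.SqlClient;

class ScopeIdentityDemo
{
    static void Main()
    {
        // Batch the INSERT and the SELECT SCOPE_IDENTITY() in one command, so the
        // identity value comes back as the scalar result of ExecuteScalar().
        string sql = "INSERT INTO Medisch (med_WeekNr) VALUES (@weekNr); " +
                     "SELECT SCOPE_IDENTITY();";

        using (SqlConnection con = new SqlConnection(
            "server=.;database=testdb;Trusted_Connection=yes"))
        using (SqlCommand cmd = new SqlCommand(sql, con))
        {
            cmd.Parameters.AddWithValue("@weekNr", 1123);
            con.Open();
            // SCOPE_IDENTITY() comes back as SQL numeric (CLR decimal), so convert
            // explicitly to the integer type you need.
            long newId = Convert.ToInt64(cmd.ExecuteScalar());
            Console.WriteLine("New Medisch row has ID {0}", newId);
        }
    }
}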

Monday, February 13, 2012

[2.0] deploy of a Report Model ?

Hello,

I use a Report Model Project to build a Report Model and I'd like to deploy it on my production server (Win 2003 server).

I can't use Visual Studio to deploy since I don't have web access to the server (only Remote Desktop access)

The command line tool RS seems to work for a simple report (rdlc) but I have a model (smdl)...

Hi

Did you find a solution to this problem?

Cheers

|||

Not sure if this is exactly what you are looking for, but I tried this tool and it solved my problem:

Reporting Services Scripter - http://www.sqldbatips.com/showarticle.asp?ID=62

Overview

Reporting Services Scripter is a .NET Windows Forms application that enables scripting and transfer of all Microsoft SQL Server Reporting Services catalog items to aid in transferring them from one server to another. It can also be used to easily move items en masse from one Reporting Services folder to another on the same server. Depending on the scripting options chosen, Reporting Services Scripter can also transfer all catalog item properties such as Descriptions, History options, Execution options (including report-specific and shared schedules), Subscriptions (normal and data-driven) and server-side report parameters.
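
For the deployment question itself: besides the Scripter tool, a model can also be published programmatically through the ReportService2005 web service. The sketch below assumes a web reference generated against that endpoint (the proxy class name, URL, folder, and file path are all placeholders) and that its CreateModel method behaves as documented - treat it as a starting point rather than a verified recipe:

using System;
using System.IO;

class DeployModel
{
    static void Main()
    {
        // Proxy class generated from
        // http://yourserver/ReportServer/ReportService2005.asmx?wsdl
        ReportingService2005 rs = new ReportingService2005();
        rs.Url = "http://yourserver/ReportServer/ReportService2005.asmx";
        rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

        // Read the .smdl produced by the Report Model Project and publish it
        // to an existing folder on the report server.
        byte[] definition = File.ReadAllBytes(@"C:\Models\MyModel.smdl");
        rs.CreateModel("MyModel", "/Models", definition, null);

        Console.WriteLine("Model deployed.");
    }
}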

[.NET and SQL Server 2000] Explicit order Insert

My current project, which I am programming in .NET, requires me to insert a
variable number of rows, which make up a set, in a specific order. A
collection of sets that are inserted one after the other is a batch. Rows in
each set MUST be kept together, and sets in each batch MUST be kept together.
What is the best way to implement this, both on the .NET side and also the
SQL Server side? Should I lock the table from inserts and updates (updating
won't be a problem, but inserting will be) before I start inserting rows? How
would I implement a system that would roll back all the inserts that have
occured in that batch if an error occurs?
Thank you very much,
Yohan MacDonagh

|||Looking more into it, it looks like the best way is to use the DataSet and
DataAdapter objects in .NET.
Can anyone answer this, however: when a datasource is being updated via a
DataAdapter, is the table locked from inserts during the update?
"Yohan" wrote:
> My current project, which I am programming in .NET, requires me to insert a
> variable number of rows, which make up a set, in a specific order. A
> collection of sets that are inserted one after the other is a batch. Rows in
> each set MUST be kept together, and sets in each batch MUST be kept together.
> What is the best way to implement this, both on the .NET side and also the
> SQL Server side? Should I lock the table from inserts and updates (updating
> won't be a problem, but inserting will be) before I start inserting rows? How
> would I implement a system that would roll back all the inserts that have
> occured in that batch if an error occurs?
> Thank you very much,
> Yohan MacDonagh|||"Yohan" <Yohan@.discussions.microsoft.com> wrote in message
news:A0C46B5C-6B07-46F1-9203-D3B4EDDD341F@.microsoft.com...
> My current project, which I am programming in .NET, requires me to insert
> a
> variable number of rows, which make up a set, in a specific order. A
> collection of sets that are inserted one after the other is a batch. Rows
> in
> each set MUST be kept together, and sets in each batch MUST be kept
> together.
> What is the best way to implement this, both on the .NET side and also the
> SQL Server side? Should I lock the table from inserts and updates
> (updating
> won't be a problem, but inserting will be) before I start inserting rows?
> How
> would I implement a system that would roll back all the inserts that have
> occured in that batch if an error occurs?
A subset of rows must be defined in terms of shared column values. So give
each row a BatchID and a SetID.
David

|||No such thing as an ordered INSERT. What you need to do is add a batch
number or other identifier to tell you which batches belong together.
--
David Portas
SQL Server MVP
--

|||Unfortunately, I cannot. I am limited by an existing (and very old) data
schema. There are no relationships. Each property of an object in .NET is a
new row in this schema (very weird, I know).
"David Browne" wrote:
> "Yohan" <Yohan@.discussions.microsoft.com> wrote in message
> news:A0C46B5C-6B07-46F1-9203-D3B4EDDD341F@.microsoft.com...
> > My current project, which I am programming in .NET, requires me to insert
> > a
> > variable number of rows, which make up a set, in a specific order. A
> > collection of sets that are inserted one after the other is a batch. Rows
> > in
> > each set MUST be kept together, and sets in each batch MUST be kept
> > together.
> >
> > What is the best way to implement this, both on the .NET side and also the
> > SQL Server side? Should I lock the table from inserts and updates
> > (updating
> > won't be a problem, but inserting will be) before I start inserting rows?
> > How
> > would I implement a system that would roll back all the inserts that have
> > occured in that batch if an error occurs?
> A subset of rows must be defined in terms of shared column values. So give
> each row a BatchID and a SetID.
> David
>
>

|||In that case please explain what you mean by a batch being "kept
together". Are you referring to an IDENTITY column here? Please post
DDL and sample data so that we can understand the problem:
http://www.aspfaq.com/etiquette.asp?id=5006
--
David Portas
SQL Server MVP
--
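
Picking up the BatchID/SetID suggestion above, a rough sketch of how the client side could keep one batch atomic - the table and column names are invented for illustration, since the original schema was not posted:

using System;
using System.Data.SqlClient;

class BatchInsertSketch
{
    static void Main()
    {
        string[] values = { "first", "second", "third" };   // one set, in order
        int batchId = 42, setId = 1;

        using (SqlConnection con = new SqlConnection(
            "server=.;database=LegacyDb;Trusted_Connection=yes"))
        {
            con.Open();
            SqlTransaction tran = con.BeginTransaction();
            try
            {
                int position = 0;
                foreach (string value in values)
                {
                    SqlCommand cmd = new SqlCommand(
                        "INSERT INTO ObjectProperties (BatchID, SetID, Position, Value) " +
                        "VALUES (@batch, @set, @pos, @val)", con, tran);
                    cmd.Parameters.AddWithValue("@batch", batchId);
                    cmd.Parameters.AddWithValue("@set", setId);
                    cmd.Parameters.AddWithValue("@pos", position++);
                    cmd.Parameters.AddWithValue("@val", value);
                    cmd.ExecuteNonQuery();
                }
                tran.Commit();   // the whole batch becomes visible together
            }
            catch
            {
                tran.Rollback(); // any failure undoes every insert in the batch
                throw;
            }
        }
    }
}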

Thursday, February 9, 2012

@@IDENTITY question

Hi, friends. I have a question about @@IDENTITY that came up while I was working on a C#/.NET project.

I use "SELECT @.@.IDENTITY, @.@.ERROR" in the stored procedure to retrieve the ID column of the row just inserted, and in my C# code, I try to access it with rdr.GetInt64(0) since ID column is bigint. However, there is a error. The type is not match. I must instead use rdr.GetDecimal(0) to access @.@.IDENTITY which is a bigint.

I am confused; does anybody have any idea?

Thanks.

xufff::I got confused, anybody has any idea?

Sure. Get used to reading the documentation.

@@IDENTITY is a (documented, btw) variable that is defined by SQL Server. It is not created based on the data type of your ID column; it is predefined.

The documentation says that the data type of the @@IDENTITY variable is numeric, which translates to decimal in the CLR.

So, according to the documentation (which I checked not to post garbage - took me 10 seconds) this is simply the expected behavior.

I would think that the reason for this is that it IS legal to have a decimal based ID field. Unusual, but legal. And they decided to use numeric for the variable type, simply because this is about the "largest" data type they can use, handling everything allowed for identity fields.

Now, all you have to do is convert this decimal to a bigint - which should not give you any problems.
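
A small sketch of the conversion described above - reading the numeric value that SCOPE_IDENTITY() (or @@IDENTITY) returns and converting it to a long for a bigint identity column. The table name is made up for illustration:

using System;
using System.Data.SqlClient;

class IdentityReadSketch
{
    static void Main()
    {
        using (SqlConnection con = new SqlConnection(
            "server=.;database=MyDb;Trusted_Connection=yes"))
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO Orders (CustomerName) VALUES (@name); " +
            "SELECT SCOPE_IDENTITY(), @@ERROR;", con))
        {
            cmd.Parameters.AddWithValue("@name", "test");
            con.Open();
            using (SqlDataReader rdr = cmd.ExecuteReader())
            {
                rdr.Read();
                // The identity value arrives as SQL numeric -> CLR decimal, even
                // when the identity column itself is bigint, so convert it.
                long newId = Convert.ToInt64(rdr.GetDecimal(0));
                int errorCode = rdr.GetInt32(1);
                Console.WriteLine("ID {0}, error {1}", newId, errorCode);
            }
        }
    }
}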