Showing posts with label insert. Show all posts

Friday, March 23, 2012

Install Error with Sept CTP

Hi all,
I am running into an install error where I am prompted "PLEASE INSERT DISC
2" when I am installing from a virtual drive using Daemon Tools instead of from
CD or DVD. I am a Universal MSDN Download Subscriber, but apparently MSDN doesn't support install issues with beta versions of their products. I am stuck, unable to use this app, and any help is much appreciated.

Sincerely,
-Lawrence
Lawrence,

A possible problem is that you didn't notice that some builds of the Sept CTP come as two separate images - Server and Tools. I assume you downloaded Server only.

Andrew Watt MVP - InfoPath wrote in message news:4aa66c60-7142-4d86-ba1c-77895be5c54d@discussions.microsoft.com...
> Hi all,
> I am running into install error when prompted "PLEASE INSERT DISC
> 2" when I am installing with virtual drive using Daemon Tools instead of
> from
> CD or DVD. I am a Universal MSDN Download Subscriber, but apparently
> MSDN don't support install issue with beta version of their product. I
> am stuck with not able to use this app, and any help is much
> appreciated. >
> Sincerely,
> -Lawrence|||You need to download both .ISO files, Servers and Tools. Just load the Tools .iso file in the same drive, and you should be fine.
|||To be clear, there are two files you need to download:

Disk1: en_SQL2005_DEV_Servers_Sept2005.iso
Disk2: en_SQL2005_DEV_Tools_Sept2005.iso

You need both of them.|||Great! I did not know the Tools ISO would also be needed together with the Server ISO. Typically an installation uses one ISO, and the Server image contains all necessary components. Was it mentioned on MSDN that Tools is also required? This caused considerable confusion.

Thanks for your assistance!
-Lawrence

Friday, March 9, 2012

INSERT-SELECT depending on the Select:ed order

I'm doing an INSERT...SELECT where I depend on the selected records being in a certain order. This order is enforced through a clustered index on that table - I can see that they are in the proper order by running just the SELECT part.

However, when I do the INSERT, it doesn't work (nothing is inserted) - can the order of the records from the SELECT part be changed internally on their way to the INSERT part, so to speak?

Actually - it is a view that I'm inserting into, and there's an instead-of-insert trigger on it that does the actual insertions into the base table. I've added a "PRINT" statement to the trigger code and there's just ONE record printed (there should be millions).|||This sure looks like a "tip of the iceberg" thread...

You can include an ORDER BY clause in your INSERT statement, at the end, where you would normally place it.

Your trigger is only going to fire ONCE per statement, no matter how many records that statement affects. Insert 1000 records with 1000 individual INSERT statements and the trigger fires 1000 times, but do a single insert of 1000 records and your trigger will only fire once.
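To illustrate the statement-level firing described above, here is a minimal sketch (all object names are hypothetical, SQL Server syntax):

```sql
-- Hypothetical base table, view, and INSTEAD OF trigger.
CREATE TABLE BaseTable (Id int, Val int)
GO
CREATE VIEW MyView AS SELECT Id, Val FROM BaseTable
GO
CREATE TRIGGER trg_MyView_Insert ON MyView
INSTEAD OF INSERT
AS
BEGIN
    PRINT 'Trigger fired'           -- printed once per INSERT statement
    INSERT INTO BaseTable (Id, Val)
    SELECT Id, Val FROM inserted    -- "inserted" holds ALL rows of the statement
END
GO
-- One multi-row INSERT: 'Trigger fired' prints once, yet 3 rows land in BaseTable.
INSERT INTO MyView (Id, Val)
SELECT 1, 10 UNION ALL SELECT 2, 20 UNION ALL SELECT 3, 30
```

This is why a PRINT in the trigger shows only one line for a single multi-million-row INSERT...SELECT: the trigger body has to process the whole `inserted` set, not one row at a time.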

Now the big question...what the heck are you doing that requires ordered inserts into a view with an INSTEAD OF trigger? Such complexity is rarely necessary.|||Thanks; I noticed that the records "sent" from the SELECT to the INSERT appeared in reversed order. And I thought the trigger was fired once per record. I've created an SP to take care of it.

The requirement on the order is because this table keeps track of how users logon to an application and then logoff. I'm changing the table from a structure where each such event has its own table row, to a structure where both events are stored in the same row (one datetime column for the logons and one for the logoffs). When transferring data from the old table to the new, I'm assuming that the Logon records are read before the Logoff records, since the Logon records mean an INSERT into the new table while the Logoff records mean an UPDATE of an already existing row (having a Logon date registered).|||You should probably just include a datetime value in your recordset rather than relying on record order.
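A set-based sketch of that migration which does not depend on row order at all (table and column names are hypothetical, and the logoff-matching predicate is deliberately simplified; real data with multiple sessions per user would need tighter matching):

```sql
-- Hypothetical schema: OldEvents(UserId, EventType, EventTime),
-- NewSessions(UserId, LogonTime, LogoffTime).

-- Step 1: one INSERT for all logon events.
INSERT INTO NewSessions (UserId, LogonTime)
SELECT UserId, EventTime
FROM OldEvents
WHERE EventType = 'Logon'

-- Step 2: one UPDATE matching each logoff back to its session row.
UPDATE ns
SET ns.LogoffTime = oe.EventTime
FROM NewSessions ns
JOIN OldEvents oe
  ON oe.UserId = ns.UserId
 AND oe.EventType = 'Logoff'
 AND oe.EventTime > ns.LogonTime
```

Because each step is a single set-based statement, the order in which the engine reads the old rows no longer matters.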

Inserts taking longer time

Hello All,

I have SQL Server 2005 installed on my machine and I am firing following query to insert 1500 records into a simple table having just on column.

Declare @i int
Set @i=0
While (@i<1500)
Begin
Insert into test2 values (@i)
Set @i=@i+1
End

Here goes the table definition,

CREATE TABLE [dbo].[test2](
Idea [int] NULL
) ON [PRIMARY]

Now the problem with this is that on one of my servers this query takes just 500 ms to run, while on my production and other test servers it takes more than 25 seconds.
The same is true for updates. I have checked the configurations of both servers and found them to be the same. Also, there are no indexes defined on either of the tables. I was wondering what the possible reason for this could be. If any of you has any pointers, that would be really useful.

Thanks in advance,
Mitesh

Although this might not be the cause, which recovery models are you using on the databases?

Jens K. Suessmeyer.

http://www.sqlserver2005.de
|||Hey Jens,
Thanks for the pointer, but we are using the Full recovery model on both databases. I think there is some problem with the server configuration, as I have tried this insert on multiple databases on my server (where the time taken is longer) and have got the same result.
|||

Your machine is probably caching disk writes, so the commits are not actually hardening. Servers are usually configured not to do that, so your data is not at risk.

You should wrap your loop with a BEGIN TRAN and COMMIT TRAN.

Declare @i int
Set @i=0

Begin Tran
While (@i<1500)
Begin
Insert into test2 values (@i)
Set @i=@i+1
End
Commit Tran

|||

Hi Mitesh,

I am having a very similar problem:

* I have a crappy testing machine (P3 at 0.5 GHz on Windows NT Server) with SQL 7,

* I have a testing environment running on a Virtual Machine on a pretty decent laptop (Laptop = Celeron @ 1.59 GHz on Windows XP - sorry, low budget). The Virtual Machine runs Windows 2003 with SQL 2005 Express, with 400 MBytes of RAM assigned.

* The database I work with is created running an SQL script. It has about 15 tables, and as a whole is a quite normalized database.

* I have this application that I did, that updates the database through ODBC.

Now here is the interesting data:

* On my SQL 7, it takes between 2 to 3 seconds to update the database (we are talking about 50 to 100 insert statements) the first time I click on the "save" button. After that is just a matter of a little bit less than a second to update the database every time I click on the "save" button.

* However, on my SQL 2005 environment, it takes about 1 minute. The amount of time does not change much the next time I click "save".

Notes aside:

* Both are empty databases. They only have the table structure.

* The tables are quite normalized. It is ok, it is not a database that it is intended to grow beyond 100 Mbytes.

* "De-normalization" is not an option.

* I changed the size of the database log file from 1 MByte to 20 to see if my problem was excessive "check points"... but the behaviour was just the same.

I learned to live with it, after all, Windows XP does not support VM's. In fact, I was rather impressed by the Celeron chip.

But the fact that caught my attention is that, as time went by and I continued using my Virtual Machine, the time got reduced as my database grew in size. After about 100 "save" operations, it now takes around 10 seconds to save. This suggests to me that there is something going on with the indexes...

It is ok, after all, it is just a testing environment.

Hope it helps!!!

Tark Ik

PS: This posting has been posted "AS IS"

|||Hey Peter,
Thanks a ton for your reply. There is no way in the world I could have figured this out on my own. The other thing I want to ask is how we can disable the caching on our server. I couldn't exactly figure out how the BEGIN TRAN and COMMIT solved the problem. It would be really helpful if you could tell me how to disable the write caching, and whether doing so is safe.

Inserts resulting in Exclusive Key Lock

I'm relatively new to SQL and I've come across something that doesn't seem quite right. When an insert becomes part of a transaction I notice an exclusive KEY lock in Enterprise Manager. The table in question was using a clustered index, but I changed that, dropped the table and brought it back in, and I still get the lock, which keeps all others out of the table. Is this the expected behavior or is there something I am missing? Could the size of the table affect things? This is a very small table currently. I'm using MSSQL 7 SP3.

Thanks,
John|||Well, put the clustered index back...

It's locking the row and/or page..

Did 7 have row? And was it a 2k or 8k page?

I can't remember any more,

BUT, you need to keep your transactions brief...like lightning brief...

What's the code do?

Is this a sproc?|||Yes, 7 has row locking; not sure about the page size. I am converting an old Btrieve app and trying to keep modifications to the code to a minimum. This is actually happening with only one user in. Within a transaction it's reading a table, inserting a row, then cycling again - multiple times. After the first insert it's not able to read the table again. Before I went in and checked whether the code could be changed, I wanted to make sure there isn't something else I could do - aside from changing the isolation level.

Thanks

Originally posted by Brett Kaiser
Well put the clustered index back...

It's locking the row and/or page..

Did 7 have row? And was it a 2k or 8k page?

I can't remeber any more,

BUT, you need to keep your transactions brief...like lightning breif...

What's the code do?

Is this a sproc?|||You mean looping?

You're probably blocking yourself...

Are you doing INSERT INTO the table you're reading from?

Why not write to a temp table, then if all the executions are fine, insert the temp to the final destination table...

can you post the code?|||Yes, I am blocking myself...

Would that be a user temp table? Will give that a try. Can't really post the code as its being generated from an application called Magic.

Even with the temp, when inserting the temp to the final dest, won't that still result in excl key locks? - although in another transaction.

Is this just the way insert locks work - has behavior changed in 2000?

Originally posted by Brett Kaiser
You mean looping?

You're probably blocking yourself...

Are you doing INSERT INTO the table you're reading from?

Why not write to a temp table, then if all the executions are fine, insert the temp to the final destination table...

can you post the code?|||Magic...that's funny...

Let's make some Magic

CREATE TABLE #myTable99(Col list)

DECLARE @x int

SELECT @x = 0

WHILE @x < 10
BEGIN
SELECT @localvariable1 = Col1, etc FROM TABLE WHERE Predicate = guarantees 1 row
INSERT INTO #myTable99(Col1, etc) SELECT @localvariable1, etc
SELECT @x = @x + 1
END

INSERT INTO FinalDataTable (Col list)
SELECT Col list FROM #myTable99

Like that...or are you using a cursor?

If it is a cursor, you can probably do without it...

Hey, I know, I'll create an app and call it Miracle

Good Luck|||Oh for Pete's sake!...|||What's wrong, my advice or the miracle thing?

Some say it would be a miracle if I ever get this third party extract done...

They keep changing their mind...

I'm on revision 12 now...purely painful...for a simple (ok, well not so simple) extract...|||Hey Brett, you rock! I just don't understand why people try to write a script to send a rocket to Mars while doing an insert?! ;)|||Sounds like a 4GL that generates the code...

Ever see SQR?

Painful...

I guess you can use it and write good code, but it leads down the cursor path..

I once saw a "program" that ran 7 hours..

8 levels of nested cursors...

Rewritten in pure SQL...

5 minutes...

So in actuality, it really was my advice

:D

Inserts gradually slow down

I'm inserting 2 million+ records from a C# routine which starts out
very fast and gradually slows down. Each insert is through a stored
procedure with no transactions involved.

If I stop and restart the process it immediately speeds up and then
gradually slows down again. But closing and re-opening the connection
every 10000 records didn't help.

Stopping and restarting the process is obviously clearing up some
resource on SQL Server (or DTC??), but what? How can I clean up that
resource manually?|||BTW, the tables have no indexes or constraints. Just simple tables
that are dropped and recreated each time the process is run.|||(andrewbb@.gmail.com) writes:
> I'm inserting 2 million+ records from a C# routine which starts out
> very fast and gradually slows down. Each insert is through a stored
> procedure with no transactions involved.
> If I stop and restart the process it immediately speeds up and then
> gradually slows down again. But closing and re-opening the connection
> every 10000 records didn't help.
> Stopping and restarting the process is obviously clearing up some
> resource on SQL Server (or DTC??), but what? How can I clean up that
> resource manually?

If I understand this correctly, every time you restart the process
you also drop the tables and recreate them. So that is the "resource"
you clear up.

One reason could be autogrow. It might be an idea to extend the database
to a reasonable size before you start loading. If you are running with
full recovery, this also includes the transaction log.
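A sketch of that pre-sizing (the database and logical file names are hypothetical, and the sizes are just examples):

```sql
-- Pre-grow the data and log files so the load doesn't pay for autogrow,
-- and set a fixed growth increment instead of the 10% default.
ALTER DATABASE MyLoadDb
MODIFY FILE (NAME = MyLoadDb_Data, SIZE = 200MB, FILEGROWTH = 20MB)

ALTER DATABASE MyLoadDb
MODIFY FILE (NAME = MyLoadDb_Log, SIZE = 100MB, FILEGROWTH = 20MB)
```

The logical file names can be found with `sp_helpfile` in the target database.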

You could also consider adding a clustered index that is aligned with
the data that you insert. That is, you insert the data in foo-order,
you should have a clustered index on foo.

But since 2 million rows is quite a lot, you should probably examine
more efficient methods to load them. The fastest method is bulk-load,
but ADO .Net 1.1 does not have a bulk-load interface. But you could
run command-line BCP.

You could also build XML strings with your data and unpack these
with OPENXML in the stored procedure.
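The OPENXML idea might look roughly like this (element, attribute, and table names are hypothetical; SQL Server 2000 syntax):

```sql
-- Unpack a whole batch of rows from one XML string inside the procedure,
-- turning many single-row inserts into one set-based insert.
DECLARE @hDoc int, @xml varchar(8000)
SET @xml = '<rows><r id="1" val="10"/><r id="2" val="20"/></rows>'

EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

INSERT INTO TargetTable (Id, Val)
SELECT id, val
FROM OPENXML(@hDoc, '/rows/r', 1)   -- flag 1 = attribute-centric mapping
WITH (id int, val int)

EXEC sp_xml_removedocument @hDoc    -- always release the document handle
```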

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||In my experience 9 times out of 10 the reason for the problem is a file
that keeps growing by 10%.
Either pre-allocate a big file as Erland suggests or change the
filegrowth from 10% (the default, which by the way is a bad default) to
something in the region of 20 MB or so.|||You're right, I removed the drop and re-create and it's definitely
slower when data already exists. So how would you suggest loading this
data?

The text file contains 11 different types of Rows. Each type of row
goes to a separate table, so I need to read each line, determine its
type, parse it and insert into the appropriate table.

Can BCP handle this? DTS? Or your XML idea?

Thanks

Erland Sommarskog wrote:
> (andrewbb@.gmail.com) writes:
> > I'm inserting 2 million+ records from a C# routine which starts out
> > very fast and gradually slows down. Each insert is through a
stored
> > procedure with no transactions involved.
> > If I stop and restart the process it immediately speeds up and then
> > gradually slows down again. But closing and re-opening the
connection
> > every 10000 records didn't help.
> > Stopping and restarting the process is obviously clearing up some
> > resource on SQL Server (or DTC??), but what? How can I clean up
that
> > resource manually?
> If I understand thius correctly, every time you restart the process
> you also drop the tables and recreate them. So that is the "resource"
> you clear up.
> One reason could be autogrow. It might be an idea to extend the
database
> to reasonable size before you start loading. If you are running with
> full recovery, this also includes the transaction log.
> You could also consider adding a clustered index that is aligned with
> the data that you insert. That is, you insert the data in foo-order,
> you should have a clustered index on foo.
> But since 2 million rows is quite a lot, you should probably examine
> more efficient methods to load them. The fastest method is bulk-load,
> but ADO .Net 1.1 does not have a bulk-load interface. But you could
> run command-line BCP.
> You could also build XML strings with your data and unpack these
> with OPENXML in the stored procedure.
> --
> Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
> Books Online for SQL Server SP3 at
> http://www.microsoft.com/sql/techin.../2000/books.asp|||I tried adjusting the settings... both different %'s and specific MBs,
but the slow down is the same in all cases.

It's dramatically slower to insert records into a 100,000 row table
than an empty one I guess.|||
andrewbb@.gmail.com wrote:

> I tried adjusting the settings... both different %'s and specific MBs,
> but the slow down is the same in all cases.
> It's dramatically slower to insert records into a 100,000 row table
> than an empty one I guess.

So these are existing tables? Do these tables have indexes already? If you
had a clustered index, for instance, and were inserting data in a random
order, there would be a lot of inefficiency and index maintenance. The
fastest way to get that much data in, would be to sort it by table,
and in the order you'd want the index, and BCP it in, to tables with no
indices, and then create your indexes.|||(andrewbb@.gmail.com) writes:
> You're right, I removed the drop and re-create and it's definitely
> slower when data already exists. So how would you suggest loading this
> data?

I can't really give good suggestions about data that I don't anything
about.

> The text file contains 11 different types of Rows. Each type of row
> goes to a separate table, so I need to read each line, determine its
> type, parse it and insert into the appropriate table.
> Can BCP handle this? DTS? Or your XML idea?

If the data has a conformant appearance, you could load the lot in a staging
table and then distribute the data from there.

You could also just write new files for each table and then bulk-load
these tables.

It's possible that a Data Pump task in DTS could do all this out
of the box, but I don't know DTS.

The XML idea would require you parse the file, and build an XML document
of it. You wouldn't have to build 11 XML documents, though. (Although
that might be easier than building one big one.)

It's also possible to bulk-load from variables, but not in C# with
ADO .Net 1.1.

So how big did you make the database before you started loading? With
two million records, you should have at least 100 MB for both data and
log.

By the way, how do call the stored procedure? You are using
CommandType.StoredProcedure, aren't you?

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||Thanks for the help. I found the fastest way to do it in a single SP:

- BULK INSERT all data into a 2 column staging table using a BCP Format
file. (One column is the RowType, the rest is the data to be parsed)
- create an index on RowType
- call 11 different SELECT INTO statements based on RowType

2,000,000 rows loaded in 1.5 minutes.
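The three steps above might look roughly like this (file paths, table names, and the fixed-width parsing are all hypothetical):

```sql
-- Step 1: stage everything in a 2-column table via BULK INSERT + format file.
CREATE TABLE Staging (RowType char(2), RawData varchar(8000))

BULK INSERT Staging
FROM 'C:\data\input.txt'
WITH (FORMATFILE = 'C:\data\input.fmt')

-- Step 2: index the type column so the per-type scans are cheap.
CREATE INDEX IX_Staging_RowType ON Staging (RowType)

-- Step 3: one SELECT INTO (or INSERT...SELECT) per row type, parsing RawData.
SELECT SUBSTRING(RawData, 1, 10)  AS Col1,
       SUBSTRING(RawData, 11, 20) AS Col2
INTO TypeA
FROM Staging
WHERE RowType = 'A '
```

Repeat step 3 once per row type (11 times in this thread's case).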

Thanks a lot, BCP works very well.|||(andrewbb@.gmail.com) writes:
> Thanks for the help. I found the fastest way to do it in a single SP:
> - BULK INSERT all data into a 2 column staging table using a BCP Format
> file. (One column is the RowType, the rest is the data to be parsed)
> - create an index on RowType
> - call 11 different SELECT INTO statements based on RowType
> 2,000,000 rows loaded in 1.5 minutes.
> Thanks a lot, BCP works very well.

Hmmm! It's always great to hear when things work out well!

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

Insert-Retrieve-Update in Text datatype.

Can anyone guide me on how to retrieve, update and insert data in a column having the 'Text' datatype? I am a newbie and I haven't done this yet.

Thanks in advance.

If you are using SQL Server 2005 consider using a VARCHAR(MAX) or NVARCHAR(MAX) instead of a text datatype. Look up the VARCHAR(MAX) and TEXT datatypes in books online.|||

No. I am using SQL 2000 and I wanted to know the method of retrieving and updating data using the text datatype.

thanks,

|||Use it in the same manner that you would use a normal varchar column. The only difference is that when using it in a where condition you can't use the '=' operator. You have to use 'like'. Other than that inserting, updating and retrieving is exactly the same.

Best regards,
Sami Samir|||

Please see the links below for some samples. For text/ntext/image value manipulation, you can use UPDATETEXT/WRITETEXT/READTEXT or SELECT/INSERT/UPDATE statements depending on your needs. There are functions like DATALENGTH and PATINDEX that will work with these data types too.

http://www.umachandar.com/technical/SQL6x70Scripts/Main53.htm

http://www.umachandar.com/technical/SQL6x70Scripts/Main63.htm

http://www.umachandar.com/technical/SQL6x70Scripts/Main12.htm
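As a rough sketch of the text-pointer statements mentioned above (table and column names are hypothetical; `Body` is assumed to be a `text` column; SQL Server 2000 syntax):

```sql
-- Plain INSERT/UPDATE work for whole values, just like varchar:
INSERT INTO Docs (Id, Body) VALUES (1, 'initial text')
UPDATE Docs SET Body = 'replaced text' WHERE Id = 1

-- For partial reads/writes, grab a text pointer first:
DECLARE @ptr varbinary(16)
SELECT @ptr = TEXTPTR(Body) FROM Docs WHERE Id = 1

READTEXT Docs.Body @ptr 0 8              -- read 8 chars starting at offset 0
UPDATETEXT Docs.Body @ptr 0 8 'patched'  -- replace 8 chars at offset 0
```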

Insertparameters in code-behind

Hi,

How do I use insertparameters in code-behind? This is the code that I have

Dim insertSql As String
insertSql = "INSERT INTO [xyzTable] ([x], [y]) VALUES (@x, @y)"
SqlDataSource1.InsertCommand = insertSql
SqlDataSource1.InsertParameters.Add("@x", "124")
SqlDataSource1.InsertParameters.Add("@y", "456")
SqlDataSource1.Insert()

When I execute these line of code I get an error saying

"Must declare the variable '@.x'."

Please help

I assume your code block is under button click event:

Protected Sub Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles Button1.Click

Dim insertSql As String
insertSql = "INSERT INTO [xyzTable] ([x], [y]) VALUES (@x, @y)"
SqlDataSource1.InsertCommand = insertSql
SqlDataSource1.InsertParameters.Add("x", "124")
SqlDataSource1.InsertParameters.Add("y", "456")
SqlDataSource1.Insert()

End Sub

ADO.NET parameters and ASP.NET parameters are different. ADO.NET parameter names always start with the @ sign, while ASP.NET's InsertParameters names do not.

|||That simple -- just remove the @. sign. Thanks for your help

Insertion with single quotes problem

I have a problem with inserting a string with single quotes. For instance,

string testme = "we don't have anything";

insert into tableone (buff) values ("'" + testme + "'");

I get an error with the word "don't" with single quote. But if I delete the single quote "dont" then it is okay. Is is a bug in sql 2005? Please help. Thanks.

blumonde

The best bet is to use parameters. Building the SQL String as you are can cause all sorts of problems.

Absent that, you need to "escape" the apostrophe. So, do the following...

string testme = "we don''t have anything";

Note there are 2 apostrophes in a row...

|||

douglas.reilly:

The best bet is to use parameters. Building the SQL String as you are can cause all sorts of problems.

Absent that, you need to "Escape" the apostrophe. SO, do the following...

string testme = "we don''t have anything";

Note there are 2 apostrophes in a row...

Hi Douglas,

The problem is that end-users write those statements and hit insert. And they don't type two apostrophes. I can't "escape" it. Thanks.

blumonde

|||

You then have two choices.

1.Use parameters.

2. Search the string for ' and replace it with '' (two apostrophes). Of course users are not going to use two apostrophes.
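Both options can be sketched in T-SQL itself (`tableone` and `buff` are from the question; the variable names are made up): sp_executesql passes the value as a real parameter so no escaping is needed, while REPLACE doubles any apostrophes before concatenation.

```sql
DECLARE @testme nvarchar(100)
SET @testme = N'we don''t have anything'  -- doubled only because this is a literal

-- Option 1: a parameterised statement; the value is never spliced into the SQL text.
EXEC sp_executesql
    N'INSERT INTO tableone (buff) VALUES (@val)',
    N'@val nvarchar(100)',
    @val = @testme

-- Option 2: escape by doubling apostrophes before building dynamic SQL.
DECLARE @sql nvarchar(400)
SET @sql = N'INSERT INTO tableone (buff) VALUES (''' +
           REPLACE(@testme, '''', '''''') + ''')'
EXEC (@sql)
```

Option 1 is preferable: it also protects against SQL injection, which option 2 only partially mitigates.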

|||

douglas.reilly:

You then have two choices.

1.Use parameters.

2. Search the string for ' and replace it with '' (two apostrophes). Of course users are not going to use two apostrophes.

I will try using parameter first. Thanks.

blumonde

|||

Parameters did it for me. Thanks.

blumonde

Insertion With Explicit IDENTITY

Hi,
I'm writing a database interface to someone else's code and am getting passed INSERT statements that look something like this:

INSERT INTO my_table(my_id, my_val)
VALUES (%s, 5)

My code is expected to replace the %s with a suitable value for an ID. Now, my_table looks like this:

CREATE TABLE my_table (
my_id INT IDENTITY(1,1) PRIMARY KEY,
my_val INT
)

and I'm trying to get a suitable value for the '%s' such that the insert will work correctly. I understand that I need to 'SET IDENTITY_INSERT my_table ON' prior to the insert, but what can I replace '%s' with? I can't put a separate SELECT statement in there, and @.@.IDENTITY is too global to be useful (it's a multithreaded app with a lot of inserts across multiple tables). Hacking the input string to remove the 'my_id, ' and '%s, ' completely is not allowed (unfortunately).

I've tried NULL in the hope that SQLServer will work it out but it complains. I don't want to do a 'SELECT IDENT_CURRENT('my_table')' prior to the INSERT due to the overhead and potential concurrency problems. Is there some special keyword or similar I can put in here?

DB is SQLServer2000 SP2. Any help is greatly appreciated.

Cheers,
Jim.|||Sounds like you have an application that was designed to run against different vendor databases OR someone has missed the design boat. I don't understand why you would define a table with an identity attribute AND have a requirement that you pass a value for the identity column.

To my knowledge there is no way to pass a placeholder in the fashion you have described.|||Originally posted by Paul Young
Sounds like you have an application that was designed to run against diffrent vendor databases OR someone has missed the design boat. I don't understand why you would define a table with an identity attribute AND have a requirment that you pass a value for the identity value.

The reason is that the app talks to a number of different back-end databases and they all have different ways of doing an insert on a table with a unique ID. For instance, Postgres expects to have nextval('my_table_f_id_seq') or similar in place of the %s. I'm just stuck (really stuck, as it turns out) with coding to the supplied spec.

Thanks for the reply.|||SET IDENTITY_INSERT my_table ON

INSERT INTO my_table(my_id,my_val)
VALUES (IDENT_CURRENT('my_table') + 1, 5)

SET IDENTITY_INSERT my_table OFF

May want to replace the + 1 with call to function IDENT_INCR()

Tim S|||Tim, that works a treat. Thanks!
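One caveat worth noting: IDENT_CURRENT('my_table') + 1 is read without any lock, so in a multithreaded app two sessions can compute the same value and one insert will then fail on the primary key. A hedge, reusing the thread's table, is to serialise the insert with an exclusive table lock held to the end of the transaction (a sketch, not a definitive fix):

```sql
BEGIN TRAN
SET IDENTITY_INSERT my_table ON

-- TABLOCKX holds an exclusive table lock until COMMIT, so the value
-- computed from IDENT_CURRENT cannot be taken by a concurrent insert.
INSERT INTO my_table WITH (TABLOCKX) (my_id, my_val)
VALUES (IDENT_CURRENT('my_table') + IDENT_INCR('my_table'), 5)

SET IDENTITY_INSERT my_table OFF
COMMIT TRAN
```

The cost is that concurrent inserts into the table are fully serialised while the transaction is open.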

Inserting data via Stored Procedure [URGENT]!

Hi all;

Question:
=======
Q1) How can I insert a record into a table "Parent Table" and get its ID (its PK) (which is an Identity "Auto count" column) via one Stored Procedure??

Q2) How can I insert a record into a table "Child Table" whose (FK) is the (PK) of the "Parent Table"!! via another one Stored Procedure??

Example:
----
I have two tables "Customer" and "CustomerDetails"..

SP1: should insert all "Customer" data and return the value of an Identity column (I will use it later in SP2).

SP2: should insert all "CustomerDetails" data in the record whose ID (the returned value from SP1) is the same as the ID in the "Customer" table.

FYI:
--
MS SQL Server 2000
VS.NET EA 2003
Win XP SP1a
VB.NET/ASP.NET :)

Thanks in advance!|||There are a couple of ways to get the last inserted identity value, but I prefer to use IDENT_CURRENT('table_name') to get the ID from the last inserted record of the specified table.


INSERT INTO Customer (val1, val2, etc) VALUES (@val1, @val2, etc)

INSERT INTO CustomerDetails (ID, Val1, Val2, etc) VALUES (IDENT_CURRENT('Customer'), @val1, @val2, etc)

|||Ok, what about if I want to do this process in two SPs?

I want to take the IDENTITY value from the first "INSERT INTO" statement, because I need this value in my source code as well as in other stored procedure(s).

Thanks in advance!|||Insert Blah...;
Return SCOPE_IDENTITY()|||Thanks guys.
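The RETURN SCOPE_IDENTITY() suggestion above can be fleshed out as two procedures, handing the new ID back through an OUTPUT parameter (the Customer/CustomerDetails tables are from the question; the column lists are hypothetical placeholders):

```sql
CREATE PROCEDURE InsertCustomer
    @Name varchar(50),
    @NewId int OUTPUT
AS
BEGIN
    INSERT INTO Customer (Name) VALUES (@Name)
    -- SCOPE_IDENTITY() returns the identity generated in THIS scope,
    -- unlike @@IDENTITY, which can be polluted by triggers.
    SET @NewId = SCOPE_IDENTITY()
END
GO
CREATE PROCEDURE InsertCustomerDetails
    @CustomerId int,
    @Detail varchar(50)
AS
BEGIN
    INSERT INTO CustomerDetails (CustomerId, Detail)
    VALUES (@CustomerId, @Detail)
END
```

From ADO.NET, `@NewId` is read via a parameter whose Direction is set to Output after ExecuteNonQuery, then passed to the second procedure.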

I think my problem is how to get the returned value (the IDENTITY value) in my VB.NET code (in other words, how to extract it from VB.NET/ADO.NET code)?

Note:
--
VB.NET/ADO.NET
or
C#.NET/ADO.NET

Either is OK, if you would like to demonstrate in your reply. (I want the answer!)

Thanks again.|||YES!!

The problem was from my ADO.NET part.

Thanks for your help, guys.

Wednesday, March 7, 2012

Insertion / updation problem in SSIS

“I have a scenario where I am trying to insert 200,000 records and update 200,000 records in the destination table using an SSIS package (SQL Server 2005), but I am not able to either update or insert the records. While executing, the package is not showing any error either. What could be the problem?”

We have business logic in the package: 1) Insert new records and 2) Update existing records, using the following data flow.

For the update we are using the OLE DB Command; for the insert we are using the OLE DB Destination.

We are using a merge join for splitting records into insert and update.

Perhaps there is blocking on the destination table. This can often happen if you're attempting 2 operations simultaneously.

Execute sp_who2 to see if there's any blocking going on.

-Jamie

|||

is there any other solution for this problem? Is it not possible to achieve both insert and update in the same package? If I run the package for fewer records, it succeeds; if I run it for more records, the same problem comes up again and again

Thanks & Regards

S.Nagarajan

|||

In that case I am even more sure that blocking is the problem. Did you execute sp_who2 as I advised?

There is an easy fix to this problem. Continue to do the insert, but push the data to be updated into a raw file. You can then use the contents of the raw file in another data-flow in order to do the update.

-Jamie

|||I ran sp_who2 and found the blocking. How can I remove the blocking? Is there a query to remove it? I couldn't understand your solution clearly; please brief me on your alternative solution.|||

Do you know what raw files are? If not, go away and study them. When you have finished read #1 here: http://blogs.conchango.com/jamiethomson/archive/2006/02/17/2877.aspx

It describes a different scenario for using raw files but the usage is the same.

-Jamie

|||

We have completed the move to flat destination files using raw files. Could you please help us move this flat-file data (for the update) into the database? What logic do we need to use? Is it required to add a data flow after the raw file destination component?

|||

Please stop writing the same question in multiple threads simultaneously. People are here to help and don't want to waste their time clicking through and reading the same thing more than once.

-Jamie

Inserting, updating Record having Single, double quotes.

Hi all
I need to insert some text into a table; the text contains single as well as double
quotes, but it returns an error during insert or update.
I converted the single and double quotes to chr(39) and chr(34) but am still
facing the problem.
Please advise how I can solve it.
Kind Regards|||For double quotes, check if you have SET QUOTED_IDENTIFIER ON; single
quotes simply have to be duplicated inside the string. Example:
USE tempdb
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE T1
(a varchar(50))
GO
INSERT INTO T1 VALUES ('A single '' apostrophe; and a double " one')
SELECT * FROM T1
DROP TABLE T1
GO
Dejan Sarka, SQL Server MVP
Associate Mentor
www.SolidQualityLearning.com
"F@.yy@.Z" <fayyaz.ahmed@.mvwebmaker.com> wrote in message
news:e68qySBuEHA.1308@.tk2msftngp13.phx.gbl...
> Hi all
> I need to insert some text in a table that contains single as well as
double
> quotes but its return error during inserting or updating.
> I converted single as well as double quote to chr(39) and chr(34) but
still
> facing problem.
> Please advise how I can solve it.
> Kind Regards
>
>
>

Inserting XML with SSIS - VERY URGENT!!!

Hello everybody,

I have a problem. I need to insert an unknown number of xml files into a database (all files are always in the same folder), into different tables; each file has the same name as the corresponding table. For example:

File            Table
user.xml        user
purchase.xml    purchase

but the number of files is not always the same, I mean, it can be 6 one day and only 4 the next day.
Can I insert the data in the xml files into the tables with a Foreach Loop Container or any other way? If it's possible, how?

Thanks in advance for your help,

Radamante71

You might want to post your question at SQL Server Integration Services forum at http://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=80&SiteID=1.

Inserting XML values

"INSERT INTO ... VALUES" seems to be the command for inserting new data into an XML doc. I have also seen people using "INSERT ... VALUES" without the "INTO" keyword. Is there a difference between "INSERT" and "INSERT INTO"?

There is no difference. 'INTO' is an optional keyword in the INSERT ... VALUES() statement. This is not specific to inserting xml values; it applies to all other SQL types as well.
|||
thank you :-)
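A quick illustration, reusing a one-column table like T1 from the earlier thread:

```sql
-- The two statements are equivalent; INTO is optional in T-SQL.
INSERT INTO T1 (a) VALUES ('with INTO')
INSERT T1 (a) VALUES ('without INTO')
```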

inserting XML into msSQL

Hello everyone,
I have a large file about 75,000 records each with about 3 elements and 1
variable.
I need to know how to insert this entire file into an existing table, or new
table or anything as long as it is moved from the XML to the msSQL.
I read into OPENXML and all that in books online, but im having a large
amount of difficulty, seeing as i need to have the data directly in the
query.. is there a way to reference a file or something..
Thanks.
The SQL Server Web Services ToolKit has a nice SQLXML BulkInsert provider
that you may find helpful.
http://www.microsoft.com/downloads/details.aspx?FamilyID=ca1cc72b-6390-4260-b208-2058c0bfd7de&DisplayLang=en
Brian Moran
Principal Mentor
Solid Quality Learning
SQL Server MVP
http://www.solidqualitylearning.com
"Adam" <adoeler@.sharklogic.com> wrote in message
news:syAOc.202$cE5.1830@.news20.bellglobal.com...
> Hello everyone,
> I have a large file about 75,000 records each with about 3 elements and 1
> variable.
> I need to know how to insert this entire file into an existing table, or
new
> table or anything as long as it is moved from the XML to the msSQL.
> I read into OPENXML and all that in books online, but im having a large
> amount of difficulty, seeing as i need to have the data directly in the
> query.. is there a way to reference a file or something..
> Thanks.
>
|||
Adam
declare @list varchar(8000)
declare @hdoc int
set @list='<Northwind..Orders OrderId="10643"
CustomerId="ALFKI"/><Northwind..Orders OrderId="10692" CustomerId="ALFKI"/>'
select @list='<Root>'+ char(10)+@list
select @list = @list + char(10)+'</Root>'
exec sp_xml_preparedocument @hdoc output, @list
select OrderId,CustomerId
from openxml (@hdoc, '/Root/Northwind..Orders', 1)
with (OrderId int,
CustomerId varchar(10)
)
exec sp_xml_removedocument @hdoc
"Adam" <adoeler@.sharklogic.com> wrote in message
news:syAOc.202$cE5.1830@.news20.bellglobal.com...
> Hello everyone,
> I have a large file about 75,000 records each with about 3 elements and 1
> variable.
> I need to know how to insert this entire file into an existing table, or
new
> table or anything as long as it is moved from the XML to the msSQL.
> I read into OPENXML and all that in books online, but im having a large
> amount of difficulty, seeing as i need to have the data directly in the
> query.. is there a way to reference a file or something..
> Thanks.
>
|||
the problem with inserting that data into the query is that there are more
than 8000 characters in the entire thing; it would take ages to go and
select 8000 at a time. I need something like navicat that works a little
more solid. navicat seems like it is skipping fields... its all a big
pain in my XXX. hah.
if anyone can help. please do..
*** Sent via Developersdex http://www.codecomments.com ***
Don't just participate in USENET...get rewarded for it!
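
As a follow-up on the size limit: on SQL Server 2005 or later (not SQL 2000), the file can be read server-side with OPENROWSET(BULK ...) and shredded with the xml type's nodes() method, avoiding both the 8000-character limit and OPENXML. The file path and target table below are hypothetical:

```sql
-- Sketch for SQL Server 2005+: read the whole file server-side, then shred it.
-- 'C:\data\orders.xml' and the Orders table are placeholders.
DECLARE @x xml
SELECT @x = CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\data\orders.xml', SINGLE_BLOB) AS f

INSERT INTO Orders (OrderId, CustomerId)
SELECT n.value('@OrderId', 'int'),
       n.value('@CustomerId', 'varchar(10)')
FROM @x.nodes('/Root/Northwind..Orders') AS T(n)
```

The service account must be able to read the file, since OPENROWSET(BULK ...) opens it on the server, not the client.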

inserting xml data

Hi,
Is there any way to insert the output of xml_auto into a table

for eg:

select * from categories for xml auto

i need the output of the above query to be inserted into another table
the destination table has one column.
thomson (saintthomson@.yahoo.com) writes:
> Is there any way to insert the output of xml_auto into a table
> for eg:
> select * from categories for xml auto
> i need the output of the abouve query to be inserted into another table
> the destination table has one column,

I think the only way you can do this in SQL 2000 is to use OPENQUERY:

INSERT tbl (col)
SELECT * FROM OPENQUERY (LOOPBACK,
'SELECT * FROM categories FOR XML AUTO')

Here LOOPBACK is a linked server back to your own, and here is a real
funny thing: you must set it up to use MSDASQL, that is the OLE DB over
ODBC provider! If you use SQLOLEDB which is the recommended provider,
you will get binary data back.

But when I did a quick test, the result was not entirely acceptable, since
the XML string was split up over six rows.

In the next version of SQL Server, SQL 2005, currently in beta, there
are significant enhancements in XML support, including a specific xml
datatype, and you can do this easily.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp
|||
xmlbulkload no good?
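
To illustrate Erland's last point about SQL 2005: there the FOR XML output can be captured directly into an xml variable, with no loopback linked server. A sketch (the destination table dest and its column are hypothetical):

```sql
-- SQL Server 2005+ sketch: capture FOR XML output directly as xml.
-- The TYPE directive makes the subquery return the xml data type.
DECLARE @x xml
SET @x = (SELECT * FROM categories FOR XML AUTO, TYPE)

-- dest(col xml) is a hypothetical single-column destination table.
INSERT INTO dest (col) VALUES (@x)
```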