Insert multiple rows in a single command #362
Comments
Hey Eric! Did you find out anything since posting? It sounds like your question is purely T-SQL/SQL Server related. I did a Google search and found this: … Is that what you needed?
Thanks! Yeah, that works, and that's what I did first. I use the same method on MySQL and it works fine. But it's extremely slow in SQL*Server for some reason. What ultimately worked for me is the sp_prepare/sp_execute approach described further down. --> Eric
Ah interesting. I remember back in the day before I discovered …
The advantage of …
Cool... but if it does not work and …
Well, it's bytes plus load on SQL*Server. With … From what I can glean from the Java SQLServer driver, there's a way to 'batch' inserts rather than send them individually. Then SQLServer can handle inserting many records at once rather than one at a time, which is more efficient. That's the only part I'm missing.
I think I'm missing more... like specifically how TinyTDS can do this when we use DBLIB?
Do you have statistics and benchmarks that specifically show differences between pure T-SQL …?
Hi, I don't know how to do batching without perhaps using a different library or language. No, I didn't do a side-by-side comparison, but: … So given a choice between implementing one over the other, I'm going to go with prepare/execute. And I got it to work.
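For anyone who lands on this thread later, here is a rough, untested sketch of what that prepare/execute pattern can look like when driven through TinyTds. The table items(a int, b int), the column list, and the connection settings are all made up for illustration; sp_prepare, sp_execute, and sp_unprepare are just plain T-SQL sent over the same connection:

```ruby
require "tiny_tds"

# Connection settings here are placeholders.
client = TinyTds::Client.new(
  username: "sa", password: "secret",
  host: "localhost", database: "testdb"
)

rows = [[1, 2], [2, 3], [3, 4]]

# Prepare the parameterized INSERT once and read back the statement handle.
# Assumes a hypothetical table items(a int, b int).
handle = client.execute(<<~SQL).each.first["handle"]
  declare @handle int
  exec sp_prepare @handle output,
       N'@a int, @b int',
       N'insert into items (a, b) values (@a, @b)'
  select @handle as handle
SQL

# Execute the prepared statement once per row (still one round trip per row).
rows.each do |a, b|
  client.execute("exec sp_execute #{handle}, #{Integer(a)}, #{Integer(b)}").do
end

# Release the server-side handle when finished.
client.execute("exec sp_unprepare #{handle}").do
```

Note that each sp_execute is still its own round trip; what this mostly saves is the repeated parse/plan work of sending the full INSERT text every time. Only integer values are interpolated here; anything else would need escaping via client.escape.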
Thanks, it would be super helpful to understand the performance more with measured data. Do you know of a blog post someone else has done on this? Anything would be interesting to me.
No, I don't have a good handle on relative performance. I was surprised to find relatively little documentation about this on the internet. Batch inserting is a relatively well-known pattern. Oh, and I can tell you that: …
Cool, thanks. I did a write-up with some simple benchmarks using …, see https://blog.engineyard.com/2011/sql-server-10xs-faster-with-rails-3-1. What would be really cool is an apples-to-apples comparison (Ruby ODBC vs. TinyTDS, say) that showed better results in a way that would help the Ruby/TinyTDS/SQL Server community understand this better. Without it, it is hard to tell what I can do, if anything. Thanks so much for sharing what you have!
What is the actual (example) SQL you're running in both cases? If I understand your point, it's that something like:

declare @handle int
-- assuming some table t(a int, b int)
exec sp_prepare @handle output, N'@a int, @b int', N'insert into t (a, b) values (@a, @b)'
exec sp_execute @handle, 1, 2
exec sp_execute @handle, 2, 3
exec sp_execute @handle, 3, 4

is 12x faster than:

exec sp_executesql N'insert into t (a, b) values (1, 2), (2, 3), (3, 4)'
Yeah, that's pretty much it, except: …
It would be interesting to do a benchmark test.
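If someone wants to run that benchmark, a minimal sketch could look like the following. Everything here is hypothetical: the bench_rows(a int, b int) table, the connection settings, and the 1,000 generated rows; the two helpers simply wrap the two approaches being compared above.

```ruby
require "tiny_tds"
require "benchmark"

client = TinyTds::Client.new(
  username: "sa", password: "secret",
  host: "localhost", database: "testdb"
)

ROWS = Array.new(1_000) { |i| [i, i * 2] }  # 1000 is SQL Server's per-INSERT row-constructor limit

# Approach 1: a single INSERT with a multi-row VALUES list.
def multi_row_insert(client, rows)
  values = rows.map { |a, b| "(#{a}, #{b})" }.join(", ")
  client.execute("insert into bench_rows (a, b) values #{values}").do
end

# Approach 2: sp_prepare once, sp_execute per row, sp_unprepare at the end.
def prepared_insert(client, rows)
  handle = client.execute(<<~SQL).each.first["handle"]
    declare @h int
    exec sp_prepare @h output, N'@a int, @b int',
         N'insert into bench_rows (a, b) values (@a, @b)'
    select @h as handle
  SQL
  rows.each { |a, b| client.execute("exec sp_execute #{handle}, #{a}, #{b}").do }
  client.execute("exec sp_unprepare #{handle}").do
end

Benchmark.bm(18) do |x|
  x.report("multi-row VALUES") { multi_row_insert(client, ROWS) }
  x.report("prepare/execute")  { prepared_insert(client, ROWS) }
end
```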
Are you sending your sp_execute calls individually, or batched up somehow? BTW, you can just say SQL Server (without the *).
Well, that's really what I'd love to understand better about SQLServer! There doesn't seem to be an actual batch command in SQLServer that works with prepared statements. I'm really sending them individually. I tried concatenating a group of 1000 … There should be a way that I can make one … I also tried creating a giant prepared statement with separate variables for each column repeated 1000 times, but SQL*Server couldn't even compile that; it just pegged the CPU. Didn't seem like a great idea anyway ;) so don't try that.
Not with T-SQL that I know of. I think that is what a stored proc is for.
LOL, so yea, maybe a stored proc is best here. I don't think there is anything in TinyTDS that would be at play here. However, if there were, I would like to know about it. If someone found the time to make a simple Rails app that could use an ENV var to demonstrate the issue between PG and SQL Server, that would be hellakewl. |
Maybe related, see this thread/post: #360 (comment)
Encountered the same issue with slow inserts of 1000 rows at a time with multiple INSERTs. Will need to try the sp_prepare and sp_execute workarounds. Thanks for the tips! In case anyone looks for more tips: there is a discussion on Stack Overflow, "Insert 2 million rows into SQL Server quickly". This one answer in particular looks promising: https://stackoverflow.com/a/15837438/1633825
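For what it's worth, another route that usually comes up in those bulk-load discussions is a T-SQL BULK INSERT from a staged file. This is only a rough sketch with a made-up table bench_rows(a int, b int) and a made-up file path: the file has to live somewhere the SQL Server service itself can read, and the login needs bulk-load permission (ADMINISTER BULK OPERATIONS / bulkadmin).

```ruby
require "tiny_tds"
require "csv"

client = TinyTds::Client.new(
  username: "sa", password: "secret",
  host: "localhost", database: "testdb"
)

rows = Array.new(10_000) { |i| [i, i * 2] }

# Stage the rows as a CSV file. NOTE: this path is interpreted by the
# SQL Server process, so it must exist on (or be reachable from) the server.
path = "/var/opt/mssql/staging/rows.csv"
CSV.open(path, "w") { |csv| rows.each { |r| csv << r } }

# Load the whole file in one statement. '0x0a' is the LF row terminator,
# which matches files written on Linux/macOS.
client.execute(<<~SQL).do
  bulk insert bench_rows
  from '#{path}'
  with (fieldterminator = ',', rowterminator = '0x0a', tablock)
SQL
```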
There is an API in FreeTDS for "bulk copy" operations. If there is interest, we could look at exposing this as an additional method in tiny_tds.
I am using Mac and Ubuntu, tiny_tds 1.3.0, and FreeTDS v1.00.40.
I want to insert many rows in a single insert statement.
In MySQL, it's possible to just post an insert statement with many (..), (..) stanzas after the values keyword and it tends to figure things out pretty efficiently. SQL*Server supports this syntax as well, but it is very very slow.
I read about sp_prepare, and the Java JDBC driver for SQLServer seems to leverage that, judging by the source code. I tried crafting a giant insert prepared statement (1000 rows * 100 columns kept SQLServer's CPU pegged). I still can't figure out how to efficiently post multiple rows for the same insert statement.
Anybody have any pointers on how to do this via the tiny_tds library?
Thanks!
Eric