Stonkie
Well-known member
- Joined: Sep 12, 2007
- Messages: 279
- Programming Experience: 1-3
I have 86193 INSERT queries to run on an SQL Server Express database from my application. The machine is the customer's so adding hardware is not an option.
Now this :
VB.NET:
Dim conn As New SqlConnection(connectionString)
Dim server As New Server(New ServerConnection(conn))
server.ConnectionContext.ExecuteNonQuery(script)
Simply throws an exception because the server doesn't have the memory to process the execution plan...
Then I tried sending the queries one by one, but it takes over 30 minutes to run without a transaction... I can only expect worse performance with a transaction (though I figured I'd make a database backup before the execution instead). The script itself is just a plain list of INSERT statements with literal values (maybe I could gain by parameterizing the actual queries, but I don't know how to do that in a batch script). One of the values is actually IDENT_CURRENT('MainTable'). I tried switching between that and a MAX(ID) FROM MainTable, and IDENT_CURRENT gives the best performance by a long shot.
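For what it's worth, parameterizing the inserts from the application side would look roughly like this. This is a minimal sketch only: "MainTable", the "Name" column, and the "rows" collection are hypothetical placeholders for the real schema and data, not taken from the actual script.

VB.NET:
```vbnet
' Sketch: one prepared, parameterized INSERT reused inside a single
' transaction, so the server compiles the plan once instead of 86193 times.
' Requires Imports System.Data and System.Data.SqlClient.
Using conn As New SqlConnection(connectionString)
    conn.Open()
    Using tran As SqlTransaction = conn.BeginTransaction()
        Using cmd As New SqlCommand(
            "INSERT INTO MainTable (Name) VALUES (@name)", conn, tran)
            Dim p As SqlParameter = cmd.Parameters.Add("@name", SqlDbType.NVarChar, 100)
            cmd.Prepare() ' compile the plan once up front
            For Each row As String In rows
                p.Value = row
                cmd.ExecuteNonQuery()
            Next
        End Using
        tran.Commit()
    End Using
End Using
```
Whether this beats the raw script depends on round-trip overhead, but it sidesteps parsing 20 MB of literal SQL.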
Does anyone have an idea how I can optimize this? Temporarily turn off some indexing or constraints? Maybe I should use an SqlCommand object rather than this "Microsoft.SqlServer.Management.Smo.Server" object? Strangely, I'm getting better performance uploading 1.30 GB of encrypted pictures with a TableAdapter than running a 20 MB script...
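In case the SqlCommand route helps anyone: running the same script in fixed-size chunks through a plain SqlCommand, instead of handing the whole thing to SMO, might look something like this. It assumes the statements are already split into a "statements" array (one INSERT per element) and uses an arbitrary chunk size; both are assumptions, not the actual code.

VB.NET:
```vbnet
' Sketch: execute the script in chunks of 500 statements through SqlCommand,
' so the server never has to build an execution plan for all 86193 inserts
' at once. Requires Imports System.Linq and System.Data.SqlClient.
Const ChunkSize As Integer = 500
Using conn As New SqlConnection(connectionString)
    conn.Open()
    For i As Integer = 0 To statements.Length - 1 Step ChunkSize
        Dim batch As String = String.Join(Environment.NewLine,
            statements.Skip(i).Take(ChunkSize))
        Using cmd As New SqlCommand(batch, conn)
            cmd.CommandTimeout = 0 ' disable the timeout for long batches
            cmd.ExecuteNonQuery()
        End Using
    Next
End Using
```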
The database is not replicated or anything. It's actually a very plain computer I use for testing purposes...
EDIT: I just got it under 15 minutes by removing the line that displayed the currently processed query's index on the console (plus a slight tweak to the queries). I still hope I can make it faster, but that's acceptable at this point...