SQL: updating a large number of rows
Here are a few tips for optimizing updates on large data volumes in SQL Server. A single large update holds locks for a long time, which can cause blocking issues.

One option is to convert the update into a bulk-insert operation: select the data into a new table with the modified value already in place, then swap the tables. The required indexes and constraints can be created on the new table before the swap, and the bulk-insert can then be further optimized for an additional performance boost.

SELECT t.*, int_field = CAST(-1 AS INT)
INTO mytable_new
FROM mytable t
-- create your indexes and constraints here
GO
EXEC sp_rename 'mytable', 'mytable_old'
EXEC sp_rename 'mytable_new', 'mytable'
DROP TABLE mytable_old

If other sessions modify data while you perform this operation, you will have to perform a true-up operation with the schema switch: create a trigger on the table to log all DML to a separate table, then, in the same transaction in which you perform the schema swap, use the log table to apply the missed changes.

What I'd try first, though, is to drop all constraints, indexes, triggers, and full-text indexes before you update. If that isn't performant enough, my next move would be to create a CSV file with the 12 million records and bulk-import it using bcp, then recreate all non-clustered primary key/unique constraints, indexes, and foreign key constraints (in that order).

Regards,
Ahmad Osama
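The DML-logging trigger for the true-up step might look like the sketch below. This is an illustrative outline, not code from the original post: the log table, the trigger name, and the assumption that mytable has a single-column primary key named id are all assumptions.

```sql
-- Hypothetical log table and trigger for the true-up step.
-- Assumes mytable has a primary key column named id.
CREATE TABLE mytable_dml_log
(
    log_id     INT IDENTITY(1, 1) PRIMARY KEY,
    id         INT NOT NULL,         -- key of the modified row
    dml_action CHAR(1) NOT NULL,     -- 'I', 'U' or 'D'
    logged_at  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
)
GO

CREATE TRIGGER trg_mytable_log_dml
ON mytable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Rows present in both inserted and deleted are updates;
    -- inserted-only rows are inserts.
    INSERT INTO mytable_dml_log (id, dml_action)
    SELECT i.id, CASE WHEN d.id IS NULL THEN 'I' ELSE 'U' END
    FROM inserted i
    LEFT JOIN deleted d ON d.id = i.id;

    -- Deleted-only rows are deletes.
    INSERT INTO mytable_dml_log (id, dml_action)
    SELECT d.id, 'D'
    FROM deleted d
    LEFT JOIN inserted i ON i.id = d.id
    WHERE i.id IS NULL;
END
GO
```

During the schema switch, the log table tells you exactly which keys changed after the SELECT ... INTO snapshot was taken, so the true-up only has to touch those rows.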
See the SET ROWCOUNT documentation: https://docs.microsoft.com/en-us/sql/t-sql/statements/set-rowcount-transact-sql?view=sql-server-2017

CREATE FUNCTION tvfSelectLatestRowOfMyTableMatchingCriteria
(
    @Param1 INT,
    @Param2 INT,
    @Param3 INT
)
RETURNS TABLE
AS
RETURN
(
    SELECT TOP (1) MyTable.*
    FROM MyTable
    JOIN MyOtherTable ON ...
)

Thus, an update query runs faster if the column to be updated is not an index key column. The index can always be created once the update completes.

Executing the update in smaller batches

The query can be further optimized by executing it in smaller batches, for example 20000 records at a time, so that each batch commits quickly and releases its locks.

Disabling delete triggers

Triggers with cursors can severely slow down the performance of a delete query.

CREATE PROCEDURE UpdateTables
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    UPDATE Table1 SET [Column] = 0 WHERE [Column] IS NULL;
    UPDATE Table2 SET [Column] = 0 WHERE [Column] IS NULL;
    UPDATE Table3 SET [Column] = 0 WHERE [Column] IS NULL;
    UPDATE Table4 SET [Column] = 0 WHERE [Column] IS NULL;
END
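Returning to the batching tip above, a minimal sketch of updating rows 20000 at a time might look like this; the table, column, and WHERE predicate are illustrative assumptions, not names from the original post:

```sql
-- Update in batches of 20000 so each batch commits quickly
-- and releases its locks; table/column names are illustrative.
DECLARE @batch INT = 20000;

WHILE 1 = 1
BEGIN
    UPDATE TOP (@batch) mytable
    SET int_field = -1
    WHERE int_field IS NULL;   -- must exclude already-updated rows

    IF @@ROWCOUNT = 0 BREAK;   -- nothing left to update
END
```

Each iteration runs as its own transaction, which keeps lock duration and transaction-log growth per batch small. Note that the WHERE clause must exclude rows a previous batch has already updated, or the loop will never terminate.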