SQL: Updating a large number of rows


15-Nov-2018 07:19

Disabling AFTER DELETE triggers will considerably improve query performance.

Replacing the UPDATE statement with a bulk-insert operation

An UPDATE statement is a fully logged operation, so it can take a considerable amount of time when millions of rows have to be updated. The fastest way to speed up the update is to replace it with a bulk-insert operation. Combined with transaction-log backups, this approach also keeps the transaction-log size to a minimum. If you don't have a mechanism for identifying the rows that have already been updated, you can use a cursor to iterate through the data and commit every x rows. Any required indexes and constraints can be created on the new table as needed. The code below shows how the update can be converted to a bulk-insert operation; the bulk insert can then be tuned further for an additional performance boost.
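As a rough sketch of the idea (the table and column names below are hypothetical, not taken from the original post), the new values are written into a fresh table with SELECT ... INTO, which is minimally logged under the SIMPLE or BULK_LOGGED recovery model, and the tables are then swapped:

-- Minimal sketch, assuming a hypothetical Sales.Orders table whose Status
-- column needs to change for millions of rows. All names are illustrative.

-- 1. Copy the data into a new table, applying the new values on the way in.
--    SELECT ... INTO is minimally logged under SIMPLE or BULK_LOGGED recovery,
--    which is what makes it faster than a fully logged UPDATE.
SELECT  OrderID,
        CustomerID,
        OrderDate,
        CASE WHEN OrderDate < '2018-01-01' THEN 'Archived' ELSE Status END AS Status
INTO    Sales.Orders_New
FROM    Sales.Orders;

-- 2. Recreate the required indexes and constraints on the new table.
ALTER TABLE Sales.Orders_New
    ADD CONSTRAINT PK_Orders_New PRIMARY KEY CLUSTERED (OrderID);

-- 3. Swap the tables so the new data takes the place of the old.
BEGIN TRANSACTION;
    EXEC sp_rename 'Sales.Orders', 'Orders_Old';
    EXEC sp_rename 'Sales.Orders_New', 'Orders';
COMMIT TRANSACTION;

-- 4. Drop the old table once the swap has been verified.
-- DROP TABLE Sales.Orders_Old;

Creating the indexes and constraints before the rename keeps the cut-over window short, and keeping the old table around until the swap is verified gives you an easy rollback path.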


The batching approach is especially useful when the table doesn't include an incremental numeric or integer primary key that could otherwise be used to track which rows have already been processed.
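In that situation, a batched UPDATE is a common alternative to a cursor. The loop below is only a sketch, reusing the hypothetical Sales.Orders table from above and assuming that rows still needing the change can be recognised by their current value; each pass updates a fixed number of rows in its own implicit transaction:

-- Sketch of batched updates when no incremental key is available.
-- Table, column, and batch-size values are assumptions, not from the post.
DECLARE @BatchSize INT = 50000;

WHILE 1 = 1
BEGIN
    UPDATE TOP (@BatchSize) Sales.Orders
    SET    Status = 'Archived'
    WHERE  OrderDate < '2018-01-01'
      AND  Status <> 'Archived';   -- keeps already-updated rows out of the next batch

    IF @@ROWCOUNT = 0
        BREAK;                      -- nothing left to update

    CHECKPOINT;                     -- under SIMPLE recovery this lets the log truncate;
                                    -- under FULL, take transaction-log backups instead
END;

Smaller batches commit and release locks sooner, at the cost of more passes over the table.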