I got a large conversion job: 299 GB of JPEG images, already in the database, to be converted into thumbnail equivalents for reporting and bandwidth purposes.
I've written a thread-safe SQLCLR function to do the business of re-sampling the images, lovely job.
This allows you to gauge performance, manage scaling, allow stops and restarts without having to start over, and gives you something to show how complete the task is (let alone show that it's actually doing anything).

Cheers.

Could you not split the query into batches, and execute each batch separately on a separate connection?
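A batched version of the update might look like the following T-SQL sketch. The table, column, and SQLCLR function names (dbo.Images, ImageData, Thumbnail, dbo.fn_ResampleThumbnail) and the batch size are assumptions for illustration, not from the original post:

```sql
-- Process the images in small batches. Because each batch only touches
-- rows whose Thumbnail is still NULL, the loop is restart-safe: stopping
-- and re-running it simply resumes with the remaining rows.
DECLARE @Rows int = 1;

WHILE @Rows > 0
BEGIN
    UPDATE TOP (1000) dbo.Images
       SET Thumbnail = dbo.fn_ResampleThumbnail(ImageData)  -- hypothetical SQLCLR function
     WHERE Thumbnail IS NULL;

    SET @Rows = @@ROWCOUNT;  -- 0 when every row has been converted
END;

-- Progress check, runnable from any other connection:
SELECT COUNT(*) AS RemainingImages
  FROM dbo.Images
 WHERE Thumbnail IS NULL;
```

Keeping batches small also keeps individual transactions (and therefore log growth and lock durations) small, which matters on a 299 GB table.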
I too recommend the "round-robin" methodology advocated by kragen2uk and onupdatecascade (I'm voting them up). The cardinality of the column (its selectivity) determines whether an index will be faster than a parallel full table scan.

Second, your hint syntax is incorrect and is being ignored! Your degree should be (cpu_count - 1). What is parallel_index? Parallelism only works with full-scan operations; see my notes here. Please note the session must be enabled for parallel DML. Find below pseudo code for that update statement.

I have to update all records (adding GUIDs) in two indexed, empty columns of 150 tables, each table with around 50k records (using a C# script to create 40k updates at once and post them to the server) and exactly four existing columns. On my local machine (16 GB RAM, 500 GB Samsung 850, SQL Server 2014, Core i5), when I try to run 10 tables in parallel it takes a total of 13 minutes, while if I run 5 the process finishes in a mere 1.7 minutes.

Your execution statistics also show high disk reads, typical of a full-scan operation. Finally, check for any redo issues during this update (log switches, redo log space waits).

One thing that's worth noting is that the optimizer decides whether or not to use parallelism only at the time the query is compiled. Also, if the query is compiled at a time when the CPU load is high, SQL Server is less likely to consider parallelism.
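The parallel-DML pseudo code mentioned above might look like the following Oracle sketch. The table, column, and function names (big_table, thumbnail, image_data, resample_fn) are assumptions, as is the degree of 7 (i.e. cpu_count - 1 on an 8-CPU box):

```sql
-- Parallel DML must be enabled at the session level, or the
-- UPDATE itself will not run in parallel even with the hint.
ALTER SESSION ENABLE PARALLEL DML;

-- Force a full scan with the requested degree of parallelism;
-- resample_fn stands in for whatever conversion function is used.
UPDATE /*+ FULL(t) PARALLEL(t, 7) */ big_table t
   SET t.thumbnail = resample_fn(t.image_data);

COMMIT;  -- parallel DML requires a commit before the session can query the table again
```

Note that the PARALLEL hint on the statement parallelizes the scan and the DML, which is why the selectivity question above matters: a highly selective indexed access path may still beat a parallel full scan.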