Hi Carrie,
What rule of thumb would you recommend for tables larger than 1 TB?
I totally understand that it depends on how the data in the table changes.
I was leaning toward a 10% size change, but for a table larger than 1 TB, is that OK? A full collect stats runs really long, and my main concern is running stats during the ETL window, which needs the resources.
And for updates there is no really good way to detect a 10% change.
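To make the size check concrete, this is roughly what I had in mind (a sketch only: DBC.TableSizeV is the dictionary view for current perm, while stats_baseline, MYDB, and BIG_TABLE are hypothetical names for a helper table we would refresh ourselves after each collection):

/* Sketch: compare current perm to the size recorded at the last
   COLLECT STATISTICS; recollect if it has grown more than 10%. */
SELECT CASE
         WHEN SUM(ts.CurrentPerm) > b.perm_at_last_collect * 1.10
         THEN 'recollect'
         ELSE 'skip'
       END AS decision
FROM DBC.TableSizeV ts
JOIN stats_baseline b
  ON  b.db_name  = ts.DataBaseName
  AND b.tab_name = ts.TableName
WHERE ts.DataBaseName = 'MYDB'
  AND ts.TableName    = 'BIG_TABLE'
GROUP BY b.perm_at_last_collect;

But since CurrentPerm only reflects size, this still misses in-place updates, which is exactly the gap I am asking about.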
So what do you recommend other than the sampled stats explained above?
We are currently on TD 13.10.
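For reference, the sampled collection I mean is the 13.10 form, where the sample percentage is system-determined rather than user-specified (the table and column names here are just placeholders):

/* Sampled statistics on TD 13.10 -- no explicit n PERCENT clause. */
COLLECT STATISTICS USING SAMPLE
    ON MYDB.BIG_TABLE COLUMN (TXN_DATE);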
Thanks
ppg.