I want to know how many milliseconds of CPU time a particular database consumes per 5-minute period. Please correct me if I'm wrong, but the Query Store looks like a great way to find that out. I wrote a query that pulls values from query_store_runtime_stats, and it seems to work fine. Where I need help is in configuring the Query Store appropriately, so that it stores what I need, but no more than that. I don't need execution plans, just CPU usage. I plan to check every hour or so to gather the values, so the data needs to stay around for at least a few hours. Cleanup will have to happen automatically, and of course I can't allow it to switch to read-only mode if too much data gets stored. What changes should I make to the following, for SQL Server 2019, to best meet my needs:
ALTER DATABASE CURRENT SET QUERY_STORE = ON
(
    OPERATION_MODE = READ_WRITE,
    CLEANUP_POLICY = ( STALE_QUERY_THRESHOLD_DAYS = 90 ),
    DATA_FLUSH_INTERVAL_SECONDS = 900,
    MAX_STORAGE_SIZE_MB = 1000,
    INTERVAL_LENGTH_MINUTES = 60,
    SIZE_BASED_CLEANUP_MODE = AUTO,
    MAX_PLANS_PER_QUERY = 200,
    WAIT_STATS_CAPTURE_MODE = ON,
    QUERY_CAPTURE_MODE = CUSTOM,
    QUERY_CAPTURE_POLICY =
    (
        STALE_CAPTURE_POLICY_THRESHOLD = 24 HOURS,
        EXECUTION_COUNT = 30,
        TOTAL_COMPILE_CPU_TIME_MS = 1000,
        TOTAL_EXECUTION_CPU_TIME_MS = 100
    )
);
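For reference, the sort of query I'm running against the runtime stats looks roughly like this (a simplified sketch, not my exact query; as I understand it, avg_cpu_time is stored in microseconds, hence the division):

SELECT  i.start_time,
        i.end_time,
        SUM(rs.avg_cpu_time * rs.count_executions) / 1000.0 AS total_cpu_ms
FROM    sys.query_store_runtime_stats AS rs
JOIN    sys.query_store_runtime_stats_interval AS i
        ON i.runtime_stats_interval_id = rs.runtime_stats_interval_id
GROUP BY i.start_time, i.end_time
ORDER BY i.start_time;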
Thank you in advance.
Rows in sys.query_store_runtime_stats are tied directly to sys.query_store_plan via the plan_id column. Not collecting the plan would also mean not collecting the runtime stats. You can limit plan storage with MAX_PLANS_PER_QUERY = 1, but then you would be missing the runtime stats for any plan beyond that limit.
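You can see the dependency in the catalog views themselves: every runtime-stats row carries a plan_id that must resolve to a captured plan. A minimal sketch:

SELECT  p.query_id,
        rs.plan_id,
        rs.runtime_stats_interval_id,
        rs.count_executions,
        rs.avg_cpu_time   -- microseconds
FROM    sys.query_store_runtime_stats AS rs
JOIN    sys.query_store_plan AS p
        ON p.plan_id = rs.plan_id;

If a plan is never captured, the matching runtime-stats rows are never written in the first place.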
Query Store might not be the ideal tool for your use case, though. Consider a cross-database query: if a query originates in one database but does the majority of its work in a different one, the CPU information is not where you would expect it.
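For example (hypothetical database and table names), the statement below executes in the context of DatabaseA, so its CPU time is attributed there, even though the work is done against a table in DatabaseB:

USE DatabaseA;
GO
SELECT COUNT(*)
FROM DatabaseB.dbo.BigTable;   -- CPU accrues to DatabaseA's Query Store, not DatabaseB's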
To bucketize the CPU by database, I would probably use Extended Events along with some framework for easy consumption.
For example, WorkloadTools or XESmartTarget, both by Gianluca Sartori.
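As a rough starting point, a session along these lines would capture the raw events (the session name and file name are placeholders; note that cpu_time in these events is reported in microseconds):

CREATE EVENT SESSION cpu_by_database ON SERVER
ADD EVENT sqlserver.rpc_completed
(
    ACTION ( sqlserver.database_id )
),
ADD EVENT sqlserver.sql_batch_completed
(
    ACTION ( sqlserver.database_id )
)
ADD TARGET package0.event_file
(
    SET filename = N'cpu_by_database'
)
WITH ( EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS, MAX_DISPATCH_LATENCY = 30 SECONDS );

The captured cpu_time values can then be summed per database_id into whatever 5-minute buckets you need, either with your own script or through one of those frameworks.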