Friday, January 12, 2018

Querying the Query Store

SQL Server 2016 finally includes Query Store functionality, where SQL Server efficiently records query/plan performance activity. Operational DBAs really need this information to see which queries are taking the bulk of the resources, and which queries have degraded after a plan flip. Before Query Store, we needed to use SQL Profiler to sample and aggregate. Query Store is much easier, and is so efficient it can be left turned on in most environments.
SQL Server provides some visualisations of the top queries etc., but they don't give me the detailed information I need. That's no problem, though, as we can write our own queries against the Query Store catalog views.
Here is my base query, which will give a list of queries and their plans, ordered by the queries with the greatest total CPU consumption. It's easy to change minimum thresholds, date range etc.
Note, if you haven't enabled Query Store, here is a link for turning it on. It's very easy. https://docs.microsoft.com/en-us/sql/relational-databases/performance/monitoring-performance-by-using-the-query-store#Enabling
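In its simplest form, enabling it is a single ALTER DATABASE statement (substitute your own database name for the placeholder below):

```sql
-- Enable Query Store on a database (placeholder name)
ALTER DATABASE [YourDatabase] SET QUERY_STORE = ON;

-- Make sure it is collecting, not just readable
ALTER DATABASE [YourDatabase] SET QUERY_STORE (OPERATION_MODE = READ_WRITE);
```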
Here it is:
declare
        @DaysBack int = 3
       ,@TotalCPUSecondsThreshold int =600;
with qTot as
       (select q.query_id, sum(rs.avg_cpu_time*rs.count_executions)/1000000 TotalCPUSeconds
              from sys.query_store_plan AS p
              inner join sys.query_store_query AS q
                     ON p.query_id = q.query_id 
              inner join sys.query_store_query_text AS qt 
                     ON q.query_text_id = qt.query_text_id
              inner join sys.query_store_runtime_stats rs
                     on rs.plan_id=p.plan_id
              inner join sys.query_store_runtime_stats_interval i
                     on i.runtime_stats_interval_id=rs.runtime_stats_interval_id
              where i.end_time>getdate()-@DaysBack
              group by q.query_id)
select
        q.query_id
       ,qt.query_sql_text
       , p.plan_id
       ,p.query_plan
       ,q.object_id
       ,convert(datetime,min(DATEADD(hh,DATEDIFF(hh,GETUTCDATE(),GETDATE()),rs.first_execution_time))) FirstExecutionTime
       ,convert(datetime,max(DATEADD(hh,DATEDIFF(hh,GETUTCDATE(),GETDATE()),rs.last_execution_time))) LastExecutionTime
       ,sum(rs.count_executions) Executions
       ,max(rs.max_dop) Max_DOP
       ,sum(rs.avg_cpu_time*rs.count_executions)/1000000 TotalCPUSeconds
       ,sum(rs.avg_duration*rs.count_executions)/1000000 TotalDurationSeconds
       ,100*sum(rs.avg_cpu_time*rs.count_executions)/sum(nullif(rs.avg_duration,0)*rs.count_executions) AvgCPUSaturationPct
       ,sum(rs.avg_cpu_time*rs.count_executions)/sum(rs.count_executions)/1000000 AvgCPUSeconds
       ,sum(rs.avg_duration*rs.count_executions)/sum(rs.count_executions)/1000000 AvgDurationSeconds 
       ,sum(rs.avg_physical_io_reads*rs.count_executions)/sum(rs.count_executions) AvgPhysicalReads 
       ,avg(rs.avg_rowcount) AvgRowcount
       ,qTot.TotalCPUSeconds TotalQueryCPU
from sys.query_store_plan p 
inner join sys.query_store_query q 
    ON p.query_id = q.query_id 
inner join sys.query_store_query_text qt 
    ON q.query_text_id = qt.query_text_id
inner join sys.query_store_runtime_stats rs
       on rs.plan_id=p.plan_id
inner join qTot
       on qtot.query_id=q.query_id
inner join sys.query_store_runtime_stats_interval i
       on i.runtime_stats_interval_id=rs.runtime_stats_interval_id
where i.end_time>getdate()-@DaysBack
  and qTot.TotalCPUSeconds > @TotalCPUSecondsThreshold
group by
        q.query_id
       ,qt.query_sql_text
       ,p.plan_id
       ,p.query_plan
       ,q.object_id
       ,qTot.TotalCPUSeconds
order by
        qTot.TotalCPUSeconds desc
       ,q.query_id
       ,p.plan_id

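If the results show a query that has degraded after a plan flip, the query_id and plan_id columns in the output are exactly what you need to pin a known-good plan with sp_query_store_force_plan. The ids below are placeholders; take real values from the result set:

```sql
-- Force the known-good plan for a regressed query (placeholder ids)
EXEC sp_query_store_force_plan @query_id = 42, @plan_id = 137;

-- Undo the forcing later if the data or schema changes
EXEC sp_query_store_unforce_plan @query_id = 42, @plan_id = 137;
```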
I hope that helps you tune your SQL Server databases.
