The sniffing database

Your SQL Server instances, like people with hay fever who forget to take their antihistamines during summer, are sniffing all the time. Sniffing is a trick employed by the optimizer in an attempt to give you better execution plans.

The most common form of sniffing is parameter sniffing. Many people know about parameter sniffing, but there are a lot of misconceptions about the subject. I have heard people describe parameter sniffing as a bad thing, and I know people who claim that parameter sniffing is mostly good, with some exceptions that they then call “bad parameter sniffing”. Neither of these statements is true; in reality, parameter sniffing (like the other forms of sniffing) is sometimes good, sometimes bad, and very often irrelevant. I will explain this in more detail in a later post – this post focuses on explaining what parameter sniffing actually is, and on what other forms of sniffing exist.

Yes, other forms of sniffing. Under the right conditions, SQL Server will also sniff variables and cardinalities. Most SQL Server developers and DBAs appear to be blissfully unaware of variable sniffing, and the few speakers and authors who do mention it tend to get it wrong. And cardinality sniffing is, as far as I know, completely undocumented. I have mentioned it a few times in some of my presentations, but never written about it – and I have never seen or heard anyone else describe this unique type of sniffing.

Parameter sniffing explained

To understand parameter sniffing, you have to know a bit about how SQL Server compiles queries into execution plans. When a query batch is submitted (either through an ad-hoc query tool such as Management Studio or the sqlcmd utility, or from a client application through e.g. the ADO.Net library or JDBC), SQL Server will first try to avoid the (expensive) compilation process: it checks the plan cache to see if the same batch has been executed before and its plan is still available. If that is not the case, then SQL Server will parse the entire batch, compile execution plans for each of the queries in the batch, and store them in the plan cache. After that, all of the plans for the batch (either taken from the plan cache, or compiled, stored in the plan cache and then taken from it) are executed, in sequence or out of sequence as dictated by control-of-flow logic in the batch.
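If you want to see this caching behavior for yourself, you can inspect the plan cache with a query like the sketch below. This is my own illustration, not part of the demo that follows; the LIKE filter is just an example and should be adjusted to the batch text you are interested in.

-- List cached plans and how often each one was reused.
SELECT  cp.usecounts,   -- number of times this cached plan was reused
        cp.objtype,     -- 'Adhoc', 'Prepared', 'Proc', ...
        st.text         -- the batch text the plan was compiled for
FROM    sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE   st.text LIKE '%SalesOrderDetail%';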

While compiling the query, the optimizer uses statistics about the data in the tables to estimate how many rows will satisfy any given condition. A condition such as “WHERE Age = 42” on its own is pretty broad. When you understand the data it operates on, your perception of the condition will change: in a database on men in a mid-life crisis, odds are that a rather high percentage of the rows will match; the same condition in the student database of a community college should generate at most a handful of hits. This is reflected in the statistics that the optimizer uses, so the same query condition can result in different plans depending on the data distribution.

When the condition uses a variable (e.g. “WHERE Age = @Age”), then SQL Server cannot use the statistics in the same way. When the query is optimized, the optimizer knows the data type of the variable (because the parser has processed the DECLARE statement), but not the value, because it has not been assigned yet; the assignment occurs when the batch executes, after the optimization process. The optimizer will still use some statistics, but not for a specific age; instead it looks at the number of distinct values in the column and assumes that the data is evenly distributed. So for a kindergarten database, the number of distinct values for Age would probably be three (5, 6, and 7), and the optimizer would assume 33% matching rows for any value of @Age passed in; for the US census database that same Age column would have over 100 distinct values, and the optimizer would estimate that less than 1% of the rows will match the condition for any value of @Age.
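You can look at these statistics yourself with DBCC SHOW_STATISTICS. The sketch below uses the index on the ProductID column of Sales.SalesOrderDetail, which features in the demo later in this post; IX_SalesOrderDetail_ProductID is the index name in my copy of AdventureWorks2012, so verify it in yours.

-- The "All density" value in the second result set equals
-- 1 / (number of distinct ProductID values); the generic estimate
-- for "ProductID = @ProductID" is simply (total rows) * (All density).
DBCC SHOW_STATISTICS ('Sales.SalesOrderDetail', IX_SalesOrderDetail_ProductID);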

A parameter looks for most purposes exactly like a regular variable. The difference is that a parameter is declared in the header of a separately executable code unit: a stored procedure, scalar user-defined function, or multi-statement user-defined function. (And since you should as a rule not use the latter two, I’ll use a stored procedure for my example). The optimizer treats the body of a stored procedure like an ad-hoc batch: when the procedure is invoked the plan cache is first checked, and when no cached plan for the procedure is found it is generated and then stored for future reuse. The key difference is how parameters are treated. To see this in action, run the below script in the AdventureWorks2012 sample database (though it probably also works in other versions of AdventureWorks), with the option to show the actual execution plan enabled. It contains four batches, to create a sample stored procedure, invoke it twice, and then drop the procedure again. The discussion below will focus on the second and third batches, with the two EXEC statements.

CREATE PROC dbo.ParameterSniffingDemo
                @ProductID int
AS
SELECT  SUM(OrderQty)
FROM    Sales.SalesOrderDetail
WHERE   ProductID = @ProductID;
GO

-- Run the procedure, then check the execution plan.
EXEC dbo.ParameterSniffingDemo @ProductID = 898;
GO

-- Run the procedure again, for a different product.
EXEC dbo.ParameterSniffingDemo @ProductID = 897;
GO

-- Clean up
DROP PROC dbo.ParameterSniffingDemo;
GO

The second batch in the script above, like any other batch, is first parsed and compiled. It is important to be aware that only the batch itself is compiled at this time. Once the compilation is done, SQL Server executes the EXEC statement: it sets the parameter value to 898 and then passes control to the stored procedure. Since the procedure was just created, there is no plan in cache yet, so at this time the compiler is once more invoked to generate an execution plan for the procedure. If you read this sequence of events carefully, you will realize that, unlike “normal” variables, the value of parameter @ProductID has been set before the compiler is invoked. The optimizer can read this value and use the specific statistics for ProductID 898 instead of the generic statistics it would use for a normal variable, to create an execution plan that is optimal for this “sniffed” value.

The third batch invokes the stored procedure again, with a different value passed into it. The batch itself is different from the second batch, so this batch, too, will be first compiled and then executed. And again control will pass to the stored procedure when execution starts, after setting the parameter to its value. But now the compiler will see that there already is an execution plan available for the stored procedure in the plan cache, so instead of invoking the (relatively expensive) optimization process again, it will reuse the existing plan.

[Image: actual execution plan for the second invocation of the stored procedure]

The picture above shows the execution plan of the third batch (the second execution plan in the output, as the CREATE PROC and DROP PROC statements do not generate any actual execution plans). Evidence of the parameter sniffing process is present in the plan, but not directly visible in the graphical representation. To see the evidence, you will have to right-click the top left SELECT operator, and then click “Properties” from the context menu. This brings up the full list of properties, as shown below:

[Image: Properties list of the SELECT operator, with the Parameter List section expanded]

The “Parameter List” section is collapsed by default; I have already expanded it in the screenshot above. This is where you can see exactly what happened: there is one parameter in this query; it had a value of 898 (“Parameter Compiled Value”) when the plan was compiled and stored in cache, and during this specific execution the value of the parameter was 897 (“Parameter Runtime Value”).
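You do not have to catch the actual execution plan in Management Studio to see the compiled value; it is also stored in the XML of the cached plan. The query below is my own sketch for retrieving it (the runtime value is not in the cache – it exists only in the actual plan captured at run time).

-- Retrieve the cached plan of the demo procedure; in the XML,
-- the ParameterList element shows ParameterCompiledValue="(898)".
SELECT  st.text,
        qp.query_plan
FROM    sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle)   AS st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
WHERE   st.text LIKE '%ParameterSniffingDemo%'
AND     cp.objtype = 'Proc';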

Variable sniffing

Above, I explained the general process of SQL Server compiling batches and entire stored procedures fully before execution starts; (non-parameter) variables do not have a value at compile time, so the optimizer can only use the more generic statistics. I left out a more specific part: statement-level recompiles.

SQL Server may discard old execution plans and restart the compilation process for various reasons. These reasons generally fall into one of three categories: correctness (any change that might cause the existing plan to fail or to return incorrect data, like dropping an index that might be used in the plan, modifying a table, changing a constraint that might have enabled a plan simplification, etc); performance (any change that might enable SQL Server to find a much faster plan, like adding an index, rebuilding statistics, adding constraints, etc); or because you tell it to (by explicitly clearing the plan cache, or by using hints – a few examples follow below).
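For that last category, here are a few concrete examples. These are standard commands, but treat them as illustrations rather than recommendations – in particular, do not casually clear the plan cache of a busy production server.

-- Mark a single module so that its plan is recompiled on its next execution:
EXEC sp_recompile N'dbo.ParameterSniffingDemo';

-- Force a one-off recompile for a single call (the resulting plan is not cached):
EXEC dbo.ParameterSniffingDemo @ProductID = 898 WITH RECOMPILE;

-- Clear the entire plan cache (server-wide; use with great care):
DBCC FREEPROCCACHE;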

Some of the reasons for recompilation can occur while a multi-statement batch or stored procedure executes. SQL Server detects that before the next statement starts, and then that statement only will be recompiled. This leads to a rather unique situation: execution has started, so now the variables do have a value, which enables the optimizer to sniff that value just as it sniffs parameters. Except … well, it doesn’t. At least not always. In fact, there is only one very specific case where SQL Server will sniff the variables: when a statement-level recompile occurs due to an explicit hint.

Here is a very simple example, again using AdventureWorks2012 (and again, probably working in other versions of AdventureWorks as well):

DECLARE @Variable varchar(20);
SET @Variable = 'B';

SELECT  FirstName, LastName, Title
FROM    Person.Person
WHERE   LastName < @Variable
OPTION (RECOMPILE);

SELECT  FirstName, LastName, Title
FROM    Person.Person
WHERE   LastName < @Variable;

If you copy this query and, without first running it, request an estimated execution plan, you will see the plan the optimizer creates when first compiling the batch:

[Image: estimated execution plans for both queries in the batch, identical to each other]

While the batch is being compiled, the optimizer does not know the value of @Variable so it has to use a broad assumption – in the case of this type of inequality filter that assumption is that 30% of the rows will match, so the plan for the queries is based on an estimate of almost 6000 rows. The RECOMPILE hint on the second query does not affect the initial compilation of the batch; both queries get the exact same plan.
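That “almost 6000” follows directly from the row count, as this quick back-of-the-envelope check (a sketch of my own, not part of the demo) shows:

-- Person.Person holds 19,972 rows in AdventureWorks2012;
-- 19972 * 0.30 = 5991.6, the estimate shown in the plans above.
SELECT  COUNT(*)        AS TotalRows,
        COUNT(*) * 0.30 AS GenericInequalityEstimate
FROM    Person.Person;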

If you then actually execute the batch, with the option to include the actual execution plan enabled, you will see this:

[Image: actual execution plans after running the batch; the first plan has changed]

The first plan has changed. The RECOMPILE hint forces SQL Server to recompile this query, and because this is a statement-level recompile, the SET statement has already executed. SQL Server can sniff the value ‘B’ in @Variable, use that to estimate that only about 900 rows will match, and then generate a plan that is optimized for that lower number of rows. You can play around with the value in @Variable to see that the plan used for the first query can actually change depending on this value.

The plan for the second query will never change. There is no RECOMPILE hint, and there were no other reasons for SQL Server to recompile this statement, so this query will always use the plan that was generated when the batch was compiled.

Fun fact: if you request an estimated execution plan after running the query, you will get the last plan that was actually used. That is because the statement-level recompile will update the plan cache entry for this batch with the new execution plan. The requested estimated execution plan will be presented to you from this plan cache entry. When you actually run the batch again, the initial compilation of the batch will also use the cached entry without redoing the full compilation process, but at run-time the RECOMPILE hint will still trigger a recompilation of the first query only – which will now result in the same plan again, unless you modified the data in the Person.Person table. This is a very important consideration. A lot of people think that adding OPTION (RECOMPILE) to a query results in the execution plan not being cached; this is not true. Those queries still result in the same memory footprint on the plan cache, with the additional cost of not only compiling the query whenever it runs, but also updating the plan in the cache on every execution.
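You can verify this cache footprint with a sketch like the one below (again, the filter text is just an example); the batch shows up in the plan cache even though it contains the RECOMPILE hint.

-- The batch plan is cached despite OPTION (RECOMPILE).
SELECT  cp.usecounts,       -- how often this cache entry was used
        cp.size_in_bytes,   -- its memory footprint in the plan cache
        st.text
FROM    sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE   st.text LIKE '%LastName < @Variable%'
AND     st.text NOT LIKE '%dm_exec_cached_plans%';  -- exclude this query itself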

Another interesting observation can be made when you right-click the SELECT operators on the two plans above and look at the properties. The second plan, which was not recompiled and sniffed, does have a Parameter List section in the plan, but in this case there is only a Parameter Runtime Value, no Parameter Compiled Value:

[Image: Parameter List properties of the second plan, showing only a Parameter Runtime Value]

(The term “parameter” is confusing; within the context of execution plan properties, SQL Server uses this term for every variable that is used in the query, regardless of whether it is an actual parameter or a “normal”, non-parameter variable).

For the first plan, the properties of the SELECT operator do not include a Parameter List section at all, which is even more confusing – this was the section where we saw evidence of parameter sniffing, and now that a variable is sniffed there is no evidence at all! And how can SQL Server even produce the correct results without embedding the variable and its runtime value in the plan? To get an answer to that question, we’ll have to look at the properties of another operator, the Index Seek (NonClustered) at the far right. And in this case we do not even have to open the full property list, we can just hover our mouse over the operator and wait for the tooltip window to pop up:

[Image: tooltip of the Index Seek (NonClustered) operator, showing the literal value instead of @Variable in the seek predicate]

This shows that the plan that is used for the statement with the OPTION (RECOMPILE) hint does not reference @Variable at all. What actually happened is that, during the statement level recompile, the parser inspected the current value of @Variable and then replaced all references to this variable in the query (or rather, in the internal representation of the query) by the value. The input that was finally received by the optimizer was exactly the same as if we had submitted the query below:

SELECT  FirstName, LastName, Title
FROM    Person.Person
WHERE   LastName < N'B'
OPTION (RECOMPILE);

For “normal” parameter sniffing, this would not be a safe implementation. The next call to the stored procedure could pass a different parameter value, and if the sniffed value were hardcoded in the plan, then the results would obviously be incorrect. That’s why a plan with parameter sniffing must keep the parameter; it optimizes for the sniffed value, but will also produce correct results for other values. In the case of variable sniffing, there is no need to keep the variable as a variable in the execution plan. A cached plan will only be reused if there is an exact textual match on the full query text, which includes the OPTION (RECOMPILE) hint – and that option guarantees that the cached plan will be overwritten with a new plan rather than be reused. So the choice to implement variable sniffing differently, by directly injecting the values into the plan, is safe.

But that same safe choice also results in a loss of variable sniffing in other scenarios. When a statement-level recompile occurs for a different reason, for instance because enough rows were added to a table to trigger an automatic update of the statistics on that table, then no sniffing is done. This makes sense when you consider what would otherwise happen. Suppose I have a batch and the third statement uses a variable; just before it executes the threshold for automatic statistics update is hit so new statistics pop up and the optimizer recompiles the query. If it would sniff the variable, it would hard-code the sniffed value in the plan, then replace the old cached plan with the new plan. Next time I execute the same batch, the variable can have a new value – but the text of the batch has not changed, and it is unlikely that there will be a statement-level recompile for another reason. So the updated plan from the cache, that now includes a hard-coded value, would be used – but that value is no longer correct and wrong results would be returned. That is of course not acceptable, and for that reason SQL Server will only sniff variables when a statement-level recompile is forced by the OPTION (RECOMPILE) query hint.

In theory, it would have been possible for Microsoft to implement a second form of variable sniffing, using the same method as for parameter sniffing. In reality, that is not the choice Microsoft made. In order to ensure that a statement-level recompile that is not caused by OPTION (RECOMPILE) produces a “safe” plan, the variables are simply not sniffed. The new plan will be based on the new statistics, but it will still use the generic estimates.

Cardinality sniffing

Table variables behave in many ways exactly the same as normal variables. This includes the optimization process. So when a batch or stored procedure is submitted, the parser interprets the declaration of the table variable, and then compilation starts without any data being in the table variable. Based on the actual situation at the time of compilation, one might expect an estimate of zero rows. However, the optimizer always assumes at least one row (unless the query includes a filter that cannot possibly be true). If you run the script below, you will see that the first two SELECT queries both have an estimated rowcount of 1, even though the actual rowcount is 0 for the first query and 19,972 for the second.

DECLARE @TabVar table
       (PersonID int NOT NULL PRIMARY KEY);

SELECT  PersonID
FROM    @TabVar;

INSERT  @TabVar
SELECT  BusinessEntityID
FROM    Person.Person;

SELECT  PersonID
FROM    @TabVar;

SELECT  PersonID
FROM    @TabVar
OPTION (RECOMPILE);

In the case of the simple queries above, this huge mistake in cardinality estimation does not affect the execution plan. But in more complex queries it can often result in very slow-running queries. This is one of many reasons why most people tend to prefer temporary tables over table variables, unless they know for sure that there will never be more than a handful of rows involved. This problem can be avoided by adding OPTION (RECOMPILE), as shown in the last SELECT above. This forces a statement-level recompile, and now the actual number of rows in @TabVar can be used by the optimizer. This can often help prevent execution plans with terrible performance, but at the price of a full compilation of the statement every time it executes.
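For comparison, here is a minimal sketch of the temporary-table alternative mentioned above. Unlike a table variable, a temporary table has statistics, so the optimizer can use the actual row count without a RECOMPILE hint.

-- Same test with a temporary table; the SELECT after the INSERT is
-- estimated from the table's actual row count and statistics.
CREATE TABLE #TempTab
       (PersonID int NOT NULL PRIMARY KEY);

INSERT  #TempTab
SELECT  BusinessEntityID
FROM    Person.Person;

SELECT  PersonID
FROM    #TempTab;

DROP TABLE #TempTab;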

All of the above is pretty well known and described at many places; I only include this basic information because it is important to understand what “cardinality sniffing” is. Cardinality sniffing is related to parameter sniffing, because it only relates to parameters passed into stored procedures (and other executable code modules). It is also related to table variables. In fact, cardinality sniffing is specifically related to table-valued parameters – table variables passed as a parameter into a stored procedure. Here is an example:

-- Define a type for the table variables
CREATE TYPE TabType AS TABLE
       (PersonID int NOT NULL PRIMARY KEY);
GO

-- Create a stored procedure that uses the table variable
CREATE PROC dbo.SniffCardinality
       @InputTable TabType READONLY
AS
SELECT  PersonID
FROM    @InputTable;
GO

-- Create and populate a table variable
DECLARE @TabVar AS TabType;
INSERT  @TabVar
SELECT  BusinessEntityID
FROM    Person.Person;

-- Invoke the stored procedure the first time
EXEC dbo.SniffCardinality @TabVar;

-- Now remove some of the rows from the table variable
DELETE  @TabVar
WHERE   PersonID < 20000;

-- Invoke the stored procedure again
EXEC dbo.SniffCardinality @TabVar;
GO

-- Clean up
DROP PROC dbo.SniffCardinality;
DROP TYPE TabType;
GO

If you run the batch above with the option to include the actual execution plan enabled, you will see a bunch of plans for all the queries in the last batch. The second and fourth are the executions of the stored procedure. Previously when we ran the same SELECT statement in a batch, the estimated number of rows for the table variable was 1. Now the estimates are different, as shown in the picture below (note that I edited the picture to remove irrelevant execution plans and include two tooltip windows):

[Image: execution plans for both invocations of the stored procedure, with tooltips showing the estimated number of rows]

As you see, the first estimate is exactly correct. That is because the table-valued parameter is used by the optimizer just as it uses other parameters. So when the stored procedure is first executed and no plan for it is in cache yet, it will sniff the table-valued parameter just as it sniffs other parameters. Now it is important to realize that this sniffing is limited. The optimizer will not read data from the table variable to get insight on the data in it. It only sniffs the metadata that is available at the time of compiling the plan. And because table variables do not have statistics, that metadata is limited to only the number of rows in the table variable that is passed into the stored procedure for the first execution.

The second estimate is wrong. After the first call to the stored procedure we deleted most of the rows from the table variable, but the execution plans that are used for the stored procedure are all still optimized on the original row count. That is because the execution simply reused the previously compiled plan from the plan cache. A change in the cardinality on the table-valued parameter, like a change to the value of a regular parameter, is no reason to recompile the execution plan for the stored procedure.
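If cardinality sniffing ever bites you, one possible workaround is the same RECOMPILE hint discussed earlier. The sketch below (the procedure name is just my example) re-sniffs the row count on every call, at the price of a statement-level recompile per execution.

-- Variant of the procedure that recompiles, and hence re-sniffs the
-- cardinality of the table-valued parameter, on every execution.
CREATE PROC dbo.SniffCardinalityEveryTime
       @InputTable TabType READONLY
AS
SELECT  PersonID
FROM    @InputTable
OPTION (RECOMPILE);
GO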

I already mentioned that cardinality sniffing has, to the best of my knowledge, never been documented before. That is not really surprising. I have yet to see my first table-valued parameter in “real” client code, and I hear almost nobody ever talk about them, so the feature appears to be pretty niche. However, when you happen to work on a system that does use table-valued parameters, then you might be faced with cardinality sniffing and its possible performance consequences. (And if not, then you can at least add “Hey, would you like to sniff my cardinality?” to your collection of useless pickup lines).

Conclusion

In this post, I first elaborated a bit on the relatively well-known concept of parameter sniffing, giving some background information on the internals at play in this process. I then went on to explain the much lesser known concept of variable sniffing, and showed that (contrary to popular opinion among those who do talk or write about this concept) it actually uses quite different internal mechanisms from parameter sniffing, and that for this reason variable sniffing only happens when a recompile is manually enforced.

In the last section, I then introduced a previously unknown form of sniffing: cardinality sniffing. In a process similar to parameter sniffing, the number of rows in a table variable can be sniffed when it is passed as a table-valued parameter into a stored procedure.

In the opening paragraph I already hinted that parameter sniffing, though well-known, is also very misunderstood. It is neither “bad” (as suggested by many experts), nor “mostly good with an occasional exception” (as suggested by most others), but actually “sometimes good, sometimes bad, and mostly irrelevant”. I will explain this in my next blog post on this subject.
