I run my offset in a temp range measured by the analog gauge at the stack end of the cook chamber. It stays within 275 to 300°.
But I also put a digital probe right next to the stem of the analog gauge, using a probe tree to hold it within an inch.
The temps are never the same. The digital probe always runs hotter, sometimes by as much as 20 degrees. So what temp am I actually cooking at?
The reason they don't agree is that the digital probe reacts faster to temp changes, and in a backyard offset the temp is almost always changing as a new split catches and slowly burns down. The analog gauge is always behind: as temps rise, the digital races out ahead, and when temps drop, the gap between the two narrows.
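To picture that lag, here is a minimal Python sketch (a rough model, not measured data) that treats the analog dial as a first-order lag chasing whatever the digital probe reads. The time constant and the pit numbers are assumptions, not measurements from my cooker:

```python
# Minimal sketch, not measured data: treat the analog dial as a
# first-order lag on the temp the digital probe reports.
# TAU (dial time constant) and the pit numbers are assumptions.

TAU = 120.0  # assumed dial time constant, seconds
DT = 10.0    # assumed sample interval, seconds

def dial_step(dial, true_temp, dt=DT, tau=TAU):
    """One step of exponential lag: the dial creeps toward the real temp."""
    alpha = dt / (tau + dt)
    return dial + alpha * (true_temp - dial)

# Fake pit cycle: a fresh split catches (climb), then burns down (fall).
pit = [275, 282, 290, 297, 300, 297, 292, 286, 280, 276]

dial = pit[0]
for temp in pit:
    dial = dial_step(dial, temp)
    print(f"digital {temp:3d}  analog {dial:6.1f}  gap {temp - dial:+5.1f}")
```

Run that and you see the same pattern: the gap is widest while temps are climbing and shrinks as they fall.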
But the meat in the cooker also does not react as fast as the digital probe. If you open your fridge door, your milk does not immediately drop in temp, even though the cold air is spilling out of the fridge.
So, is an average of the digital readings more accurate?
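If you want to try it, here is a rough sketch of a rolling average over the digital readings. The window size is an assumption you would tune to your cooker, not a rule:

```python
from collections import deque

# Sketch of a rolling average over digital readings; the window size
# is an assumption (e.g. several minutes of samples), not a rule.

def rolling_average(readings, window=30):
    """Yield the mean of the last `window` readings as each one arrives."""
    buf = deque(maxlen=window)
    for r in readings:
        buf.append(r)
        yield sum(buf) / len(buf)

# Made-up digital readings across one split's burn cycle:
digital = [295, 298, 300, 297, 291, 286, 283, 285, 290, 295]
for raw, avg in zip(digital, rolling_average(digital, window=5)):
    print(f"raw {raw}  avg {avg:.1f}")
```

The idea being that a several-minute average is closer to what the meat actually feels than any single spike.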
The difference between the two gauges can be the difference between what some call low-and-slow and hot-and-fast.