Temperatures: Easy to measure?

Author: Rolf Suessbrich, Dortmund, Germany, 13/02/2005, Version 1.14

Everybody active in darkroom work owns one tool that is trusted to show a physical property quite correctly: a thermometer! It is also well known that precise temperature control is one of the basics of reproducible film development. A deviation of 1K (temperature differences are given in K, while absolute temperatures are given either in C, with 0 as the freezing point of water, or in K, which is C + 273.15) is a significant deviation and must be compensated in the development time, otherwise we will not achieve reproducible development.

But let's look at real life:

A bunch of thermometers is exposed to the same environment, so they should all show the same temperature, but do they?

[Photo: the thermometers compared]


What kind of thermometers do we have here?
A is a relatively new, cheap lab thermometer for –10C to 100C.
B is about 10 years old and also a cheap lab thermometer for –10C to 100C.
C and D are expensive precision thermometers for colour development processes.
E was sold as a tank thermometer for film processing.
F is a Screwpull wine thermometer (similar design to E).
G is a lab thermometer for –15C to 150C.

All thermometers except D and G are filled with alcohol; D and G are mercury-filled.

A, E and F are made in the new style, where the capillary is just a channel in a body made entirely of glass. The scale is printed on the outside of the glass surface.

So what do we see?

Thermometer    A       B       C       D       E       F       G
Reading        24.0C   23.8C   26.6C   25.5C   24.5C   25.0C   25.0C

Now what is the correct value? The answer is: we can't decide! Most astonishing is that the two precision thermometers show the highest temperatures, and that the difference between their readings is more than 1K.

We therefore have the first conclusion:


The absolute accuracy of common thermometers is very poor!


Looking at the values above, the risk of this poor accuracy is the following:

Let's assume the development time for a film is 20min@20C (values like this are easily reached, e.g. in D76 1+3). We reduce the development time by 10% per 1K above 20C, and the resulting development time according to the temperature measured would be:

Thermometer    A         B         C        D       E       F       G
Dev. time      12.0min   12.5min   6.8min   9min    11min   10min   10min

We have a gap from 6.8min to 12.5min, which is close to 100%(!), and everybody who has experience with film development knows that the results for the same film at these times will definitely NOT be the same: either the film will be extremely underdeveloped at 6.8min, or (if the shorter time is OK) it will be heavily overdeveloped at 12.5min.
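The arithmetic behind the table is simple enough to sketch in a few lines of Python (the constants and the function name are mine, purely for illustration):

    # 10% per 1K rule: shorten the 20min@20C base time by 10% for every
    # kelvin the measured temperature lies above 20C.
    BASE_TIME_MIN = 20.0
    BASE_TEMP_C = 20.0
    RATE_PER_K = 0.10

    def adjusted_dev_time(measured_temp_c):
        """Development time in minutes for a given measured temperature."""
        deviation_k = measured_temp_c - BASE_TEMP_C
        return BASE_TIME_MIN * (1.0 - RATE_PER_K * deviation_k)

    readings = {"A": 24.0, "B": 23.8, "C": 26.6, "D": 25.5,
                "E": 24.5, "F": 25.0, "G": 25.0}
    for name, temp in readings.items():
        print(f"{name}: {adjusted_dev_time(temp):.1f} min")
    # Reproduces the spread of the table above (rounding aside):
    # A: 12.0 min ... C: 6.8 min ... G: 10.0 min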

After discussing the problems with the lack of absolute accuracy, we also have to look at the relative accuracy: how precisely can we determine temperature differences?

See the following values, which were determined by starting in chilled water; the temperature was then raised twice by adding a batch of warm water (all values in C):

Thermometer    A      B            C      D      E      F      G
1st temp T1    11.5   -            14.8   13.5   13.0   14.0   13.0
2nd temp T2    16.2   -            19.0   17.8   17.0   18.0   17.2
3rd temp T3    28.0   -            31.0   29.6   28.5   29.0   29.0
T3 - T1        16.5   was broken   16.2   16.1   15.5   15.0   16.0


Again we see poor uniformity! The last line shows the difference between the highest and the lowest temperature. A shows a span 10% larger than F.

How can we live with such uncertainties? Easily! I have declared the precision mercury thermometer to be my standard thermometer, and the precision alcohol thermometer is my work thermometer. This way the potential danger of the mercury is minimized (it seems the vapours are poisonous, and vapours may escape when the thermometer gets broken), because this thermometer rests in a safe wooden box in a safe place. I have glued a small table at an easily reachable location that gives the values read from C and the 'real' values of D. As we can see from the second table, the relative accuracy of both precision thermometers is nearly the same (16.2 and 16.1), so they are "precise" in terms of relative accuracy, but badly adjusted!

Here is my way to achieve reproducible development conditions:

WORK THERMOMETER
Thanks to my table I know that when I want water of 19.2C, the scale must show 20.6C. If my work thermometer gets broken, I'll buy a new one and standardize it against my 'standard' thermometer. Give a thermometer filled with alcohol some time, at least 1 minute, before you believe the temperature shown. Also take into account that standard thermometers must be fully submerged to show exact temperatures (for me this means the thermometer must be dipped so deep into the liquid that the whole length of the alcohol column is below the liquid level).

As a test: have two glasses of water available, one filled with cold tap water, the other with warm tap water. Put the thermometer into the cold water, stir a little and wait until the temperature shown no longer changes. You will see that it takes some time. Now put the thermometer into the warm water. The temperature will rise, but see how slowly the upper stable value is reached, even when you stir with the thermometer. It might take up to a minute until the reading no longer changes. However: if you see the alcohol column reaching the upper limit of the scale, remove the thermometer from the warm water immediately, otherwise it will burst! (In that case we do not have a common understanding of what is warm and what is hot!)

A table mapping work thermometer readings to standard temperatures looks like this:

Tread (work thermo)     19.1C   20.1C   21.1C   22.1C   23.1C
Treal (stand. thermo)   18.0C   19.0C   20.0C   21.0C   22.0C

(In this case Tread from the work thermometer is generally 1.1C too high, but only because both thermometers happen to have the same linear slope.) So when I want to have 20C, I mix until I read 21.1C! Luckily I just have to add or subtract a constant value.
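A minimal sketch of this constant-offset correction (Python; the 1.1C offset comes from my table above, the function names are just for illustration):

    OFFSET_C = 1.1  # work thermometer reads 1.1C too high (from the table)

    def reading_for(real_temp_c):
        """Scale value to aim for on the work thermometer."""
        return real_temp_c + OFFSET_C

    def real_from(reading_c):
        """Real temperature behind a given work-thermometer reading."""
        return reading_c - OFFSET_C

    print(reading_for(20.0))  # 21.1 -> mix the bath until the scale shows this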

If the deviation itself changes with temperature (i.e. the two thermometers do not have the same slope), the table will look different and we might have to plot a correction line.

See the following charts:

The left chart shows the temperature readable on each thermometer (except B) versus the temperature shown on the reference thermometer D. It is clearly visible that G and C (and nearly A) have the same slope as D, so the difference is just a constant value at all temperatures. A, E and F (the 'new' way of fabricating thermometers) show a different slope, which makes correcting the readings more complex. This can be seen in the right chart: when we want to adjust the temperature of something to e.g. 21C, we have to read 19.7C from work thermometer F. The other way round: when we read 23C from this thermometer, it is in reality 24C.

[Charts: thermometer readings versus reference thermometer D (characteristic curves)]
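When the deviation is not just a constant offset, a least-squares line through a few calibration points gives the correction. Here is a sketch in Python; the two calibration pairs for thermometer F are taken from the chart example above, and in practice you would record several pairs against the standard thermometer:

    def fit_line(points):
        """Least-squares slope a and intercept b of real = a*reading + b."""
        n = len(points)
        sx = sum(r for r, _ in points)
        sy = sum(t for _, t in points)
        sxx = sum(r * r for r, _ in points)
        sxy = sum(r * t for r, t in points)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        b = (sy - a * sx) / n
        return a, b

    calibration_F = [(19.7, 21.0), (23.0, 24.0)]  # (work reading, real temp)
    a, b = fit_line(calibration_F)

    def real_from_reading(reading_c):
        return a * reading_c + b

    print(real_from_reading(23.0))  # 24.0 -> matches the chart example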


STANDARDIZED DEVELOPMENT TIMES
My development times are adjusted in terms of film/developer/dilution/agitation to get optimum contrast in the negatives for my enlarger, and I measure temperature with my work thermometer.
 
DEVELOPMENT AT 20C
Thanks to a very efficiently heat-insulated development tank (to be described on these pages in the future), development always starts at 20C, and after 15min the temperature deviates at most 0.5C from the starting temperature.

These prerequisites allow me to control the negative development process completely and precisely.

And electronic thermometers? Are they more accurate? No, not at all. Most electronic thermometers are specified to show temperatures within +/- 1C at a resolution of 0.1K. We have the same situation here as with the precision thermometers above: the relative accuracy is OK (resolution of 0.1K), while the absolute accuracy is not (+/- 1C)!
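The distinction can be made concrete with a toy model (Python; the fixed offset and the rounding are illustrative assumptions, not the spec of any real instrument):

    import random

    # Toy model of an electronic thermometer: an unknown but constant
    # calibration offset within +/- 1C (absolute accuracy) plus a display
    # rounded to 0.1C (resolution).
    random.seed(1)
    OFFSET_C = random.uniform(-1.0, 1.0)  # fixed per instrument

    def display(true_temp_c):
        return round(true_temp_c + OFFSET_C, 1)

    # Differences are trustworthy, because the offset cancels out ...
    print(display(29.0) - display(13.0))  # ~16.0: good relative accuracy
    # ... but the absolute reading may be off by up to 1C:
    print(display(20.0))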

Why do we have these problems?

Temperature is a physical entity that we can't see (and seeing is our sharpest sense) but only feel, and feeling is very imprecise. Measurement is done by watching changes in physical properties that react to temperature changes. With traditional thermometers it is the thermal expansion of mercury or alcohol. The liquid stored in the bulb expands very nearly proportionally to temperature, and the additional volume required is taken up by the capillary of the thermometer. That is why the column grows with rising temperature, and that is why a thermometer bursts when the liquid column reaches the upper end: with no space left for expansion, everything cracks, because liquids and solids are not compressible (compared with gaseous substances) and thermal expansion can exert gigantic forces.

Now, it is possible to produce very precise capillary tubes for the column of the thermometer, but the volume of the bulb MUST exactly match a given value, otherwise the thermal expansion in absolute terms is too large or too small. If the required value is not hit exactly, a bulb volume too large leads to 'nervous' instruments that overreact to temperature changes, while the contrary leads to 'lazy' reactions to changes in the environment. All this seems controllable! But as we can see from the thermometer comparison above, it is difficult.
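To get a feeling for the orders of magnitude, here is a rough estimate of the column rise per kelvin, h = V*beta/A (Python; the bulb volume, expansion coefficient and bore diameter are illustrative assumptions, not measurements of the thermometers above):

    import math

    V_BULB_MM3 = 200.0      # assumed bulb volume (0.2 cm^3)
    BETA_PER_K = 1.09e-3    # approx. volumetric expansion of ethanol
    BORE_DIAMETER_MM = 0.2  # assumed capillary bore

    area_mm2 = math.pi * (BORE_DIAMETER_MM / 2) ** 2
    rise_mm_per_k = V_BULB_MM3 * BETA_PER_K / area_mm2
    print(f"{rise_mm_per_k:.1f} mm per K")  # ~6.9 mm with these numbers

A small error in the bulb volume or the bore diameter shifts this sensitivity directly, which is exactly why hitting absolute values in production is so hard.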

With electronic thermometers we usually look at the resistance change of a material with temperature, or we exploit something like a law of nature: the voltage across an internal layer of an integrated circuit (a pn junction) drops by about -2mV/K. This means the device has to handle voltage differences of 0.2mV per 0.1K, and 200µV is a very small voltage, difficult to handle. Again, the adjustment to absolute values is the problem; the relative accuracy usually is not.

In general, temperature is a physical property that is not easy to measure very accurately. However, we can trust ourselves to measure temperature quite reproducibly within our own process. But when we tell a friend to develop a film at this or that temperature with that developer (and dilution) and this agitation, he will very likely fail to get comparable results, because his thermometer will very likely show different temperatures than ours.









