OK, thanks, guys, for your thoughts. To answer your questions: my standard is a laboratory mercury-in-glass thermometer with a recent calibration certificate. On a heavily overcast day, the difference between it and the sensor is generally better than 0.3 deg C, which isn't bad. As its bulb is reflective, I assume it is relatively free of radiation errors. On a bright, sunny, windless day, with the thermometer placed in the shadow of my radiation shield, there is a 4 deg C difference. Poking the bulb through the louvres of the shield, I get essentially the same readings as the sensor, proving the shield is not 100% effective.
The shield is a stacked plant-pot-saucer type in semi-glossy white-pigmented polyester co-polymer (I suspect from recycled polyethylene terephthalate). I have tried it with the bottom open and closed: the difference is difficult to quantify, but it is not enormous, probably less than 0.5 deg C. Unfortunately, I cannot place it anywhere but in a shadeless south-facing garden. As yet, I have not tried forced ventilation. I am considering placing an aluminium-foil-covered sheet round half of it, so that the whole shield is never in direct sunlight, even though that would restrict wind access; this could be overcome by forced ventilation. Even more tempting is the aluminium-foil-covered horizontal tube described in one of the above references. That would have the advantage of being relatively immune to albedo errors, which are not negligible.
To get an idea of albedo errors, before I "permanently" installed the sensor, I did a number of unscientific, qualitative tests. I fixed the sensor to a 1 m pole, with no radiation shield. Over several calm, sunny days, after the sun went behind a nearby hill in the afternoon (~15:30-16:00), I stuck the pole in different substrates and allowed the readings to settle. My "standard" position was over pale sandstone crazy paving: I took a reading there about 30 minutes after the sun disappeared, moved the pole over a different substrate, took temperature and humidity readings, then moved it back and read it again. I averaged the start and finish temperatures and RHs and assumed that, since the drop in temperature was reasonably linear with time, this average approximated the reference condition at the time of the test-substrate measurement. I obtained the following results:
Over Bermuda grass, cut short: T = -2, RH = +8
Over Kikuyu grass, cut to ~75 mm: T = -3, RH = +10
Over cultivated soil: T = 0, RH = +9
Over packed soil: T = 0, RH = +5
Over red-dyed concrete: T = -0.5, RH = -5
There was no irrigation or rainfall during this period. I'd guesstimate the margin of error to be T +/- 1, RH +/- 5 (T in deg C, RH in %). Note these figures are always delta-T and delta-RH relative to the paving reference, so the absolute values are probably not material. I concluded from this that it is a mistake to place a hygrometer over grass or soil, and a thermometer over a hard surface! As the sensor combines both, I had to compromise, so I chose the red concrete, which made for a more convenient installation.
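For anyone who wants to repeat this, the drift correction I described can be sketched as a simple linear interpolation. This is just my reading of the procedure, with hypothetical names and example numbers of my own (not measurements from the tests above):

```python
# Sketch of the baseline-drift correction: the reference reading over
# the paving drops roughly linearly after sunset, so the reference
# value at the moment of the test-substrate reading is estimated by
# linear interpolation between the start and finish reference readings.

def drift_corrected_delta(ref_start, ref_finish, test_value,
                          t_start, t_finish, t_test):
    """Return the test reading minus the interpolated reference reading.

    ref_start, ref_finish: reference readings (deg C or %RH) taken over
        the 'standard' paving before and after the test reading.
    t_start, t_finish, t_test: clock times of the three readings, in any
        consistent unit (e.g. minutes).
    """
    frac = (t_test - t_start) / (t_finish - t_start)
    ref_at_test = ref_start + frac * (ref_finish - ref_start)
    return test_value - ref_at_test

# When the test reading falls midway between the reference readings,
# interpolation reduces to the simple start/finish average used above.
# Made-up example: reference drops 20.0 -> 18.0 deg C over 30 min; a
# grass reading of 17.0 deg C at the midpoint gives delta-T = -2.0.
print(drift_corrected_delta(20.0, 18.0, 17.0, 0, 30, 15))
```

Averaging the start and finish readings, as I did, is exactly this formula for a test reading taken at the halfway point; interpolation just generalises it if the timing is uneven.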
This is my current thinking.