[UPDATED with new solar sensor] Solar in a jar question

Ah yeah, I see what you mean, it looks like I will have to experiment with the experiment lol
I don't use the pumps for the dents, I use steel rods behind the panel :slight_smile:

Do you think the vacuum will make a big difference or will it be minimal? Why would the vacuum
even make a slight difference? I mean, what's the difference between air and no air, apart from not
breathing lol

Removing the air would just reduce the amount of heat the thing will store, so the sensor would heat up and cool down just a little bit faster. I doubt it's really significant.
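For what it's worth, a rough back-of-envelope comparison suggests the air holds only a tiny fraction of the stored heat. The jar volume and glass mass below are just assumed ballpark figures, not measurements:

[code]
# Rough heat-capacity comparison: air inside the jar vs. the glass itself.
# All figures below are ballpark assumptions, not measurements.

air_volume_l = 0.5            # assumed internal volume of the jar (litres)
air_density_g_per_l = 1.2     # dry air at roughly 20 C
air_cp = 1.0                  # J/(g*K), approx. specific heat of air

glass_mass_g = 200.0          # assumed mass of the glass jar
glass_cp = 0.84               # J/(g*K), approx. specific heat of soda-lime glass

air_heat = air_volume_l * air_density_g_per_l * air_cp      # ~0.6 J/K
glass_heat = glass_mass_g * glass_cp                        # ~168 J/K

print(f"Air:   {air_heat:.1f} J/K")
print(f"Glass: {glass_heat:.0f} J/K")
print(f"Air stores ~{100 * air_heat / glass_heat:.1f}% as much heat as the glass alone")
[/code]

So pulling a vacuum removes well under 1% of the stored heat, which is why I doubt it's significant.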

Ah ok, I will have a go though I think and see what differences there are, if any.
I have a high-powered torch, 1 million candle power, so I will use that as the sun to test it.
At least then I will be able to make a stand for the torch and create consistent tests :slight_smile:

Still not got myself another sensor yet :frowning:

But as for the one I'm using, I think it might be too small. When it's totally
overcast I get 0% solar all the time; I would have thought there would
have been a slight reading, but this morning it's overcast and the solar
has been 0% all the time, not even the slightest reading, and it's not even 10 AM yet.

Should I be getting a slight reading when overcast?

If so then I think I need to go bigger than a light bulb, or at least bigger than the bulb that
I am using, as it is smaller than your average bulb, that's for sure…

I think I will have to either go back to the original flask but with no black in there,
or go for a smaller flask and try with no black, and if that's not much better then
try with the black…

Sounds like a reasonable idea. If there are only 2 degrees C or so between dark and full sun, maybe you have invented something that is just too sensitive, i.e. it is operating more like a DARK/SUNNY switch than something which provides a range of values.

My small Co-op olive jar seems to give a useful range of 0.1C to 6C.

Thought: What’s your ‘Low’ figure? Perhaps it would be worth playing with that a little?

Hi David

My low figure is the default 1.1 and high is 2.5; it took me ages to get that high figure.
It was a sunny day and my old sensor was at 13.7, I think it was, for 100%, so it took
a while to slowly set this one up. I couldn't believe it was actually as low as 2.5, so I
even did it again to make sure, and sure enough it was right lol

What I have done now is added an offset to the extra temp sensor, I think it's set
at about 65, and this has allowed me to set the high figure to 6.2 now. It seems
to be working ok but only time will tell. I am hoping this will still give me a reading
even when overcast, as before it wasn't, it was on 0% all the time… Also I am hoping
that when night time comes it will drop to 0%, but will see later on lol

The best part is, it's a lot more sensitive, and that's the reason for going smaller :slight_smile:

My guess is that the root of the 0% problem was the ‘low’ figure of 1.1.

(This is perhaps obvious, but) by having low of 1.1 C you are telling WD that until the ‘jar’ is 1.1 C warmer than the standard outside temp in your radiation shield then it’s ‘dark’. The original ‘High’ of 2.5 C is then telling WD that only 1.4 C more difference than the ‘low’ figure shifts to 100% sunshine!
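Put another way, here's a rough sketch of the kind of mapping I mean. It assumes WD simply interpolates linearly between the 'Low' and 'High' figures; I haven't checked its exact formula:

[code]
def solar_percent(jar_temp_c, outside_temp_c, low=1.1, high=2.5):
    """Map the jar/outside temperature difference to a 0-100% solar value.

    Assumes a simple linear interpolation between 'low' (0%) and 'high' (100%);
    the real Weather Display formula may differ.
    """
    delta = jar_temp_c - outside_temp_c
    if delta <= low:
        return 0.0
    if delta >= high:
        return 100.0
    return 100.0 * (delta - low) / (high - low)

# With Low = 1.1 and High = 2.5 the whole 0-100% range sits inside just 1.4 C:
print(solar_percent(15.0, 14.4))   # delta 0.6 C -> 0%
print(solar_percent(16.2, 14.4))   # delta 1.8 C -> 50%
print(solar_percent(17.0, 14.4))   # delta 2.6 C -> 100%
[/code]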

Unless you need the 1.1 C to 'calibrate' your 'jar' probe to the standard outside temp, I suspect that a lower figure might be better.

How do the two temperatures compare at night (i.e. when it’s dark)? That might be a way of establishing a value for ‘low’ (but maybe you’ve already done that, or maybe that’s what the offset is effectively doing now?).

Am I making any sense?

Nah, not much sense to me, but it takes a lot to sink in lol

My night time comparison was near on spot on with both the solar and the temp in the screen.
I didn't think you could go much lower than 1.1, as it says do not use 0, set at 1.1 and 7.1 for
a starting point… oh well, typical of me misunderstanding as usual lol

So when it really is ‘dark’ (i.e. 0% solar) the temp difference is just about zero. In that case my logic is to have the ‘low’ figure set to just about zero too?

With 1 decimal place available, you can use something as small as 0.1 if you wish and still avoid using zero. :slight_smile:

Ah ok, I'm with you now… Let me see what the temp is again tonight when it's dark,
then will go from there, although with the offset it does look pretty good at the
minute, but again that could change lol

Thanks for your advice!!

Turns out my outdoor temp (in the Stevenson Screen) is 14.4C and the solar sensor is 15.5C at night time;
also my UV probe is 15.2C. So I am now wondering if my outdoor temp is out by about -1C,
or failing that the 2 probes are out by +1C lol, or it's a bit warmer 20ft up on the mast?

Hi Simon

I would ignore what the UV temperature sensor is indicating (due to the modifications being made, it's not actually reading temperature).

Your other two temperature sensors I would check against a standard thermometer; monitor them over a couple of days
and check them, if you can, at the same time.

Have you tried assembling the solar sensor in the laboratory flask without using the black paint on the tube?
Inside the tube run a steel rod and expose it through the bottom of the flask so it is exposed to the air;
this will increase the cooling down of the air inside. My original idea was to stick a small heatsink at the end of the flask,
touching the metal rod.

mick

It would probably be surprising if they agreed exactly. The only way of testing which is ‘correct’ is to calibrate against a standard (as Mick says), and that would most likely show that both are slightly off ‘true’. Maybe one is slightly high and the other is slightly low. That’s only to be expected. This is just ‘home grade’ equipment we are using after all. If you want +/- 0.1C accuracy you need more expensive equipment, and you need to send it away regularly for calibration[1]. Without that the best we can hope for is reasonable accuracy and, if we are lucky, long term stability.

Getting back to ‘Temp in a Jar’ calibration. What you have found is that when it is dark (0% Solar) the difference really is around 1.1 C (i.e. like your original ‘Low’)!

[1] I once worked in a temp calibration lab. ‘Quality’ thermometers are expensive, and even the expensive ones need regular calibration against better ones (and / or against physical constants) if you want to remain confident. Things may have changed now but when I was working there the ultimate check was for ‘master’ sensors to be sent off to the National Physical Laboratory for calibration against their standards!

I am the Manager of a UKAS (United Kingdom Accreditation Service) Accredited calibration laboratory, and temperature is one of the areas that we deal in. We have very expensive (>

[quote author=IanF link=topic=32624.msg267864#msg267864 date=1216039279]
Generally I’ve found the OS temperature sensors to be within +/-0.5

Thanks for the advice and info guys…

David, are you saying the 1.1 was correct for the low setting?
Because I had a play yesterday and set the low to 0.3 and high
at 2.2, and it looks to be working very well. It's now recording something
when it's overcast, as before it didn't, it was always 0%. It's overcast now, well,
near as damn it, it's more very cloudy than overcast, and it's showing a reading.

Please see the image below and you will see how it's performed today, it's looking
pretty responsive to me?

Please ignore the UV readings, major issues with that at present. It's a connectivity
issue; why it's started these last few days I will never know :frowning:


“Correct” in this case is “whatever gives you sensible outputs”. :lol:

Your figures from last night suggested 1.1 as a ‘Low’, but maybe as you say it just happened to be warm ‘up the pole’ at that time. If you are getting good results with a lower ‘Low’ then probably a good idea to stick with that for a while and see how it goes.

If it ain’t broke, don’t fix it… :smiley:

Sounds good to me, will see how it goes now, thanks m8 :slight_smile:

Hi there,

Really interesting topic. I learned a lot from it! Before this, I thought it would be a good idea to place a second sensor in the shade and put a few solar-energised LEDs next to the sensor. When the sun shines, the LEDs would come on and warm the sensor up. When it becomes overcast, they would emit less light and heat. I thought this would be good to avoid the residual warmth of the jar you use. But I guess a small jar or bulb may be a better solution. The graphs speak volumes, and for such a low-cost solution it seems to perform exceptionally well!

I do not understand why you should calibrate two sensors to a third, calibrated one if what you want to compare is the delta T between them. I would take the two sensors, put them together, measure the difference in exactly the same situation and note the difference. Best to use several points: -20 degrees (fridge), 0 degrees (ice water) and maybe both in the sun in the jar to give a reading at about 20-30 C.

This is because both sensors won't behave linearly, so deviations will be different at each point and the delta T will be different also. So in the winter you could change the 0-solar-radiation point if the difference between the two at 0 or -20 C and at +20 C is somewhat different.
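If anyone wants to play with that idea, here is a minimal sketch of the sort of multi-point correction I mean; the reference points and offsets below are just made-up illustrations, not real calibration data:

[code]
# Minimal sketch of a multi-point correction of one sensor relative to the other.
# The reference temperatures and measured differences below are invented examples.

# (reference temp in C, observed difference sensor B minus sensor A at that temp)
calibration_points = [
    (-20.0, 0.4),   # fridge/freezer point
    (0.0,   0.1),   # ice water point
    (25.0, -0.2),   # jar in the sun point
]

def sensor_offset(temp_c):
    """Linearly interpolate the sensor-to-sensor offset at a given temperature."""
    points = sorted(calibration_points)
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

def corrected_delta(jar_temp_c, outside_temp_c):
    """Delta T with the interpolated sensor offset removed."""
    return (jar_temp_c - outside_temp_c) - sensor_offset(outside_temp_c)

print(corrected_delta(16.0, 14.4))
[/code]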

Second point: the accuracy of Oregon stations. Coincidentally I read a German test of weather stations, with the OS 928 sensors among them, and it noted exactly the same deviations. This was tested in a lab. They noted a maximum deviation of -0.5 and +0.4 C for the outdoor sensor. But when the original sensor screen was used outside, the temperature difference became 7 C on some occasions.
Maybe even better was the hygrometer, with a maximum deviation of -1% and +0.8%. That's really astonishing. Note that the WS 3600, for instance, did worse on temperature (-0.7 and +1.9 C) and, although second best in a test field of 4 stations, had deviations of -10.2 and +1.7% in relative humidity! Finally: the barometer did excellently also, with -0.7 hPa / +1.28 hPa as maximum deviations. Again much better than the others (the La Crosse 3600 had -9.6/0 hPa as maximum deviations, for instance).

But the weak point of the OS was the wind meter, with a maximum deviation of -2.2 m/s…0 m/s, and the rain gauge, which had deviations of -5.7 mm / 0 mm. The -5.7 mm is the real weak part, as the OS measures only at 1 mm intervals, so a lot of rain will be lost. The range was 44-77 m, depending on the sensors, in open field, which is also worse than other stations. I have to say that the range is a bit surprising to me. In another German test by other testers, they had the same outcome: the temp + humidity + barometer had the same precision, which is OK. Only in this test the range and reliability were rated as excellent. I guess that is the outcome of what I read on different forums. And those other stations were rated less reliable, with more sensor transmission failures… so maybe this depends on the unit used (?).

Another thing is that the sensors were the most powerful of all: -21.6 dBA. Others were close; La Crosse was at -17.7 dBA and clearly worse. So maybe the receiver was to blame here.

When it comes to temperature, humidity and barometer readings you can't get anything better. Davis is not better at these points; I own a couple of Davis stations and have thermometers in the same radiation shield (Stevenson screen) with a calibration certificate, and the deviations were also 0.5 K, sometimes a bit higher. Davis does a much better job when it comes to rain and wind though. I read that the WMR-200 does a better job with wind than the 928; it seems to turn much more easily in low winds.

Does anybody know how accurate the UV meters from OS are? Never seen them tested…

I think the price/performance ratio of the OS, although difficult to measure, is on par with Davis.