I have been checking my January log file 12009lg.txt and have found several duplicate records in it. The time stamp and fairly static data like temps and rain are the same, but in most cases things like the wind gust or wind direction are slightly different, say 244 and 242 degrees for wind direction. These duplicates are spread across different days and at different times, only ever happen as a pair, and then it goes for days with no duplicates, so there is no pattern as to when this occurs. I should also point out that the next minute's record is there, so it's not that the time stamp is wrong; it just looks like two records being written in the same minute. This would seem to indicate to me that these records are being written by WD and not simply duplicated by some other factor. I'd be interested in knowing if anyone else is seeing this happening? Brian, have you any ideas as to why this might be?
Edit: I just checked February’s log and it has some duplicates as well.
I also should have said I'm running 10.37N build 6, and it happened on this level today.
I came across this issue recently, as WD cannot deal with more than 44640 lines in a single logfile and the duplicates were adding too many extra lines.
I have a WMR200 and I think the duplicates may be occurring when WD downloads data from the console memory. Problems with live data would be a different issue.
You can use Excel to remove any duplicate lines very simply in a single hit, with no searching or checking required. I certainly haven't bothered to chase extra lines with minor differences at this stage; I just needed to keep the file below the magic 44640 number.
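If you'd rather not use Excel, a short script can do the same thing. This is only a sketch, and it assumes (which the thread doesn't confirm) that a WD logfile is space-separated with the date and time in the first five fields (day, month, year, hour, minute); adjust the field count if your format differs.

```python
def dedupe_log(lines):
    """Keep only the first record for each timestamp, preserving order."""
    seen = set()
    result = []
    for line in lines:
        fields = line.split()
        if len(fields) < 5:
            result.append(line)  # keep short/blank lines untouched
            continue
        stamp = tuple(fields[:5])  # assumed (day, month, year, hour, minute)
        if stamp not in seen:
            seen.add(stamp)
            result.append(line)
    return result

# Hypothetical sample: two records share the 23:59 minute, wind dir 244 vs 242.
sample = [
    "31 1 2009 23 58 1.5 80 244",
    "31 1 2009 23 59 1.5 80 244",
    "31 1 2009 23 59 1.5 80 242",
]
print(len(dedupe_log(sample)))  # the duplicate minute is dropped
```

Keeping the first of each pair matches what Excel's Remove Duplicates would do; if you'd rather keep the second record, reverse the list before and after deduping.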
I have a Davis VP1 and so far I've not found any problems which seem to be related to this. If it were just the log file, it would be easy to remove duplicates; however, the problem is also there in the data (.inf) files, which are not editable. I think we need Brian's opinion on the problem and what can be done about it, as I'd like to preempt problems rather than waiting for them to happen.
As I said, these are extra records with some different data, usually something which changes quickly, like wind direction or speed, which can change on a VP every 2.5 seconds.
I only have 4 logfiles currently and have run the logchecker on 3 of them. How did you determine that the lines were dupes? I mean, is the logchecker looking at timestamps? Or did your temp change enough to cause a hit by the logchecker? Otherwise I don't see how to recognize the lines as dupes…
The latest version of the log checker (I think you may need to download the last shown file rather than the first) checks for duplicate records by comparing date/time stamps, and it finds (for me) several occurrences of duplicate records in several of my log files. When I look at the log file itself at that time stamp, there are indeed two lines with identical times, but fast-changing data like wind speed/gust and direction may be slightly different between them. On checking the data log files (.inf) I find the same thing (I have not yet checked all .inf files). Anyway, this suggests to me that it has to be WD writing two records with identical time stamps to both lg.txt and the equivalent .inf file; the chances of this being done by anything other than WD are, in my view, extremely remote. I believe you need version 0.40 of the log file checker for finding duplicates.
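For anyone wondering how timestamp-based duplicate detection works in principle (I don't have the log checker's source, so this is only an illustrative sketch under the same field-layout assumption as above, date/time in the first five fields): a single pass comparing each line's stamp to the previous line's is enough, since the duplicates appear as adjacent pairs.

```python
def find_duplicate_stamps(lines):
    """Return 1-based line numbers whose timestamp repeats the previous line's."""
    dupes = []
    prev = None
    for i, line in enumerate(lines, start=1):
        fields = line.split()
        stamp = tuple(fields[:5]) if len(fields) >= 5 else None
        if stamp is not None and stamp == prev:
            dupes.append(i)
        prev = stamp
    return dupes

# Hypothetical sample: line 3 repeats line 2's minute.
sample = [
    "31 1 2009 23 58 1.5 80 244",
    "31 1 2009 23 59 1.5 80 244",
    "31 1 2009 23 59 1.5 80 242",
]
print(find_duplicate_stamps(sample))  # [3]
```

This answers the earlier question: no temperature change is needed to get a hit, only a repeated date/time stamp.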
After checking my error logs I found many hours of dupes in my December 2008 file… so yes, the logfile checker does find these entries OK.
In my case, and this has happened before, WD continues to write the log file but does not update the date and time. All the other readings appear to be valid, i.e. temp, humidity, etc., but the date/time does not increment. My January log file did not contain any of this problem and I haven't checked February yet.
My wx station does not log data so no import missed data feature. My guess is my problem is not related to your problem…but I will be using the logfile checker to keep an eye on my logfiles. If I notice anything similar to your issue I will re-visit this thread.
ideally there should be no more than 44640 lines in your logfile (if there are, then when you convert the logfile to a data file, you will miss any data past the 44640th record)
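For anyone wondering where the 44640 figure comes from, it is simply one record per minute for a 31-day month:

```python
minutes_per_day = 60 * 24      # 1440 one-minute records per day
limit = minutes_per_day * 31   # records in a full 31-day month
print(limit)  # 44640
```

So a 31-day month with no duplicates fills the file exactly; any duplicate record pushes it over the limit.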
checking some of my 31-day month files, that has not happened
I would not worry about the occasional time stamp in the logfile that has not increased the minute from the time before
It's not going to be a problem for the logfile or the .inf data file
Well, my January one this year is 44641 lines, which means the last minute is lost. If this happens a lot, then I would potentially lose many minutes when I convert the log file to data. However, I can see that the data file still has the duplicate records in it, so it has 44641 records as well, since I can see the records for 23:59 on the 31st. So the issue here is that "convert log files to data" drops records.
Since the extra records are definitely good data, it seems a pity to drop them prior to any convert. Would it be possible to fix it up so that the convert process will actually convert however many records there are in the log file to the relevant data file, since the data file can contain more than 44640 records?