Forecast with the Perl script from Saratoga

Thank you Ken, I got it… the header is about the same in English and Greek; the difference is a dash/space between "5" and "day". I saw that today #-o

Hi edje, I am not a programmer, so I can't help much more. You have an error in those two lines; I did not change them and they work here. I run the script in a directory and it creates the files in that same directory. Try deleting the /forecast directory:

$outfilenameHTML = './forecast-5day.htm'; # XHTML 1.0 compliant (hopefully)
$outfilenameHTMLFULL = './forecast-7day.htm'; # Full WU 7-day forecast

To cure the errors in the script, just put a # before the command in question, as follows:

Useless use of a constant in void context at line 144. Do this: #EOH2

Name "main::LOG" used only once: possible typo at line 28. Do this: #close LOG;

Name "main::quiet" used only once: possible typo at line 14. Do this: #$quiet = 0;

The script will run without errors
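Commenting the lines out works, but the warnings can also be silenced without deleting code. A minimal sketch (the variable name $main::quiet is taken from the warning above; the rest is illustrative, not from the script):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# "used only once: possible typo" belongs to Perl's 'once' warning class;
# instead of commenting the line out, you can disable just that class around it:
{
    no warnings 'once';
    $main::quiet = 0;   # no "possible typo" warning emitted here
}
print "ok\n";
```

The `no warnings 'once'` is lexically scoped, so the rest of the script keeps full warnings.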

The paths are not correct in your setup; hence the error "unable to open ./forecast/forecast-5day.htm: No such file or directory", which is why the script does not work.
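The cure is to make sure the ./forecast directory exists before the script tries to open files in it. A small sketch, assuming only the directory and file names shown in the error message:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Path qw(make_path);

my $outdir = './forecast';                 # path taken from the error message
make_path($outdir) unless -d $outdir;      # create it so open() can succeed

open my $fh, '>', "$outdir/forecast-5day.htm"
    or die "unable to open $outdir/forecast-5day.htm: $!";
print {$fh} "<!-- placeholder -->\n";
close $fh;
print "created\n";
```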


In your script the URL is not correct for Amsterdam.

You have my($req) = new HTTP::Request('GET', ', netherlands');

It should be my($req) = new HTTP::Request('GET', '')

You need to set ./forecast-5day.htm to read/write; better yet, set the whole folder to read/write until it works. This is a permissions problem.
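From a shell that would be chmod 666 on the file (or 777 on the folder while testing); the same can be done from Perl itself. A sketch, assuming only the file name from the error message:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $file = './forecast-5day.htm';          # file name from the error message
open my $fh, '>', $file or die "cannot create $file: $!";
close $fh;

chmod 0666, $file or die "chmod failed: $!";            # world read/write
printf "mode is now %04o\n", (stat $file)[2] & 07777;   # confirm the new mode
```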

OK, sorry, got excited.

use this


OK, I am running the script on my local PC and on my web server; it works OK on both. Amsterdam weather!


What does it mean when my CGIs fail with “Premature end of script headers”?

It means just what it says: the server was expecting a complete set of HTTP headers (one or more followed by a blank line), and didn’t get them.

The most common cause of this problem is the script dying before sending the complete set of headers, or possibly any at all, to the server. To see if this is the case, try running the script standalone from an interactive session, rather than as a script under the server. If you get error messages, this is almost certainly the cause of the “premature end of script headers” message. Even if the CGI runs fine from the command line, remember that the environment and permissions may be different when running under the web server. The CGI can only access resources allowed for the User and Group specified in your Apache configuration. In addition, the environment will not be the same as the one provided on the command line, but it can be adjusted using the directives provided by mod_env.

The second most common cause of this (aside from people not outputting the required headers at all) is a result of an interaction with Perl’s output buffering. To make Perl flush its buffers after each output statement, insert the following statements around the print or write statements that send your HTTP headers:

 local ($oldbar) = $|;
 $cfh = select (STDOUT);
 $| = 1;
 # print your HTTP headers here
 $| = $oldbar;
 select ($cfh);

This is generally only necessary when you are calling external programs from your script that send output to stdout, or if there will be a long delay between the time the headers are sent and the actual content starts being emitted. To maximize performance, you should turn buffer-flushing back off (with $| = 0 or the equivalent) after the statements that send the headers, as displayed above.
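On any reasonably recent Perl, the select-and-restore dance above can be written more readably with IO::Handle's autoflush method; a sketch of the equivalent:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Handle;                 # gives file handles an autoflush() method

STDOUT->autoflush(1);           # same effect as the $| = 1 juggling
print "Content-type: text/html\r\n\r\n";   # headers reach the server immediately
STDOUT->autoflush(0);           # buffering back on for the page body
print "ok\n";
```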

If your script isn’t written in Perl, do the equivalent thing for whatever language you are using (e.g., for C, call fflush() after writing the headers).

Another cause of the “premature end of script headers” message is the RLimitCPU and RLimitMEM directives. You may get the message if the CGI script was killed due to a resource limit.

In addition, a configuration problem in suEXEC, mod_perl, or another third party module can often interfere with the execution of your CGI and cause the “premature end of script headers” message.

In other words, a server adjustment is required, or maybe the script should be run differently. I run the script from cron, using a short script as below, placed in the forecast folder.
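The short script itself was posted as an attachment rather than quoted. Purely as a hypothetical sketch (the script name WU-forecast.pl and all paths here are placeholders, not from this thread), a cron wrapper could look like:

```perl
#!/usr/bin/perl
# hypothetical crontab entry: 0 * * * * /usr/bin/perl /path/to/forecast/run-forecast.pl
use strict;
use warnings;
use FindBin qw($Bin);

chdir $Bin or die "cannot chdir to $Bin: $!";   # run from the forecast folder
# call the forecast script with the same perl that runs this wrapper:
# system($^X, 'WU-forecast.pl') == 0 or warn "forecast script failed: $?";
print "wrapper ran in $Bin\n";
```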





Hm, I have to read that once again, but I see it's going the right way.

Where to place this, more precisely?

local ($oldbar) = $|;
$cfh = select (STDOUT);
$| = 1;
# print your HTTP headers here
$| = $oldbar;
select ($cfh);

OK, let's go back to basics.

Can you run this script on your local PC? If yes, then the script is OK.

If no, then there is a script error. I can email the script I am using to you, and then you can try again.





Hi Steve, I got the cron sorted…

Peeps, Steve sorted this out for me and it works a treat; just wanted to mention that :slight_smile:

I have 2 working CGI test files in cgi-bin.
I download them with WS_FTP (auto mode),
rename them to test2.cgi,
nothing more,

upload them,
and get an internal error.

Running it from the server's root dir, the same error remains:

Software error:
…unable to open ./forecast-5day.htm: Permission denied

Bedtime. Next time better; thanks all for the suggestions.

Ken ! help me out with another script !

please… :slight_smile:

Out of curiosity…

What have you tried to correct the error?


I’m a little confused. Are you trying to run the script as a CGI on your webserver? If so, the script wasn’t really designed to do that, as it’s missing the parts that write the HTTP headers. The script (as written) can be run by a cron process on a server (or a Windows scheduler process on a PC with ActiveState Perl installed) to create the two files with the contents of the WU forecast for your area.

How were you intending to use the script?

Best regards,


Why are you running the script as a CGI? Can you not just run it with the bash script? Am I missing something here?


I asked Panos how and what. Now it's clear: run it with the Perl Package Manager.

Sorry for wasting all of your time :oops: I'll buy you all a box of Havanas.
Come get them here at the river Zaan, with a view of nice mills and wooden shoes… hm.

Thanks again all; learning some more about searching, reading, asking, trying.


To keep the topic readable for others I removed some of my replies :wink:

One thing to ask: a bit off topic, but…

what program do you find works best for FTP up/downloading of scripts?

When I download a testfile.cgi that works at the host,
rename the file locally (without opening it), and upload it with WS_FTP, I get an internal error.

Next step: how to get it into a wx*.php file. Is any script available for that?
Searching Ken's pages now. Not sure it's there (but I shouldn't be surprised if it is).

So how do I get the output into those wx*.php scripts?
By changing both outputs to wx*.php?


The easiest way to get the info into your wx*.php page is to use the built-in PHP include() call. Put the include() call in your wxlocal*.html page where you’d like the forecast to appear, like this:

 <?php include("forecast-5day.htm"); ?>

That assumes you’re running the Perl script via cron, and ftp uploading the forecast-5day.htm and forecast-7day.htm files to your website.

You may have to play with the script to have it omit the surrounding <html> and <body> code from its output.

Hope this helps…
Best regards,

Got it !

have to pimp it up :smiley:

thnx Ken

So I now have some wxlocal*.html files;
WD runs them each hour into wx*.html.

I open the file with Notepad and save it as wx*.php,
upload, and voilà.

Since WD can auto-upload it as HTML to my host,
how do I run a cron job to rename all wx*.html files to .php:

rename wx*.html wx*.php ?
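A plain rename wx*.html wx*.php does not expand wildcards the same way on every platform. Since Perl is already installed here, a small sketch that does the rename (the demo file it creates first is only there to show the effect; in real use, drop that part):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# demo setup only: create one file like the ones WD produces
open my $fh, '>', 'wx1.html' or die $!;
print {$fh} "demo\n";
close $fh;

# rename every wx*.html in the current directory to wx*.php
for my $html (glob 'wx*.html') {
    (my $php = $html) =~ s/\.html\z/.php/;
    rename $html, $php or warn "could not rename $html: $!";
}

print -e 'wx1.php' ? "renamed\n" : "not renamed\n";
```

Run from cron, this would convert everything WD has uploaded since the last pass.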

It doesn’t run in the Customize Internet and File Creation setup (see attachment).