I updated the NWS-related scripts per an earlier post. The last couple of weeks it worked fine (I was unaware this function existed, so I was pleased to find it). Yesterday and today we have a winter storm warning, and nothing displays.
The website is:
Started to wonder if my web host is blocking something. Any suggestions welcome.
As it’s working from a browser but not a script, this NWS response to a similar problem may apply…
Applications accessing resources on weather.gov now need to provide a User-Agent header in any HTTP request. Requests without a user agent are automatically blocked. We have implemented this usage policy due to a small number of clients utilizing resources far in excess of what most would consider reasonable.
We recommend providing a user agent string in the following format:
ApplicationName/vX.Y (http://your.app.url/; [email protected])
This will both uniquely identify your application and allow us to contact you and work with you if we observe abnormal application behavior that may result in a block.
I don’t know how you’d add the header to your script though.
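For the PHP scripts used in these templates, one common way to send the header is with cURL. Here is a minimal sketch, assuming cURL is available on your host; the application name, URL, e-mail address, and zone ID are placeholders you would replace with your own details, not the template’s actual settings.

```php
<?php
// Minimal sketch: fetch an NWS resource while sending the User-Agent the NWS asks for.
// The application name, URL, e-mail and zone ID below are placeholders - use your own.
$url = 'https://api.weather.gov/alerts/active/zone/OHZ055';

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_USERAGENT      => 'MyWeatherSite/1.0 (https://example.com/; you@example.com)',
    CURLOPT_TIMEOUT        => 15,
]);
$data = curl_exec($ch);

if ($data === false) {
    echo 'Fetch failed: ' . curl_error($ch) . "\n";
} else {
    echo substr($data, 0, 500) . "\n"; // show the beginning of the response
}
curl_close($ch);
```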
Provider is saying the same (not blocked by them, it’s the NWS). Wonder if it’s possible the user-agent being used in line 1053 is now triggering this as the scripts are more heavily used. Unlikely I suppose, as there would be more issues reported.
It’s far more likely that too many requests per minute caused the NWS to auto-block your webserver’s IP address temporarily. Your webserver IP of p4.host-ed.me has 37 other sites on that server… maybe one of them was not ‘playing nice’.
Make sure that caching is enabled for the nws-alerts.php script set (and that cache files can be written). That will automatically reduce the number of API accesses needed.
If you have caching enabled, and the block persists, then it’s likely another site on your shared server is causing it.
You can also turn off checking temporarily, wait a few hours and enable it again to see if that fixes the issue (by having the NWS autoblock remove itself).
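For illustration, this is the general shape of the file caching involved: serve a recent cached copy when one exists, and only hit the API when the cache has expired. The file name, cache lifetime, zone ID, and User-Agent here are assumptions for the sketch, not the actual nws-alerts.php settings.

```php
<?php
// Sketch of simple file caching to cut down API requests.
// File name, lifetime, zone and User-Agent are illustrative, not the script's real settings.
$cacheFile   = './cache/nws-alerts-OHZ055.json'; // hypothetical cache file
$cacheMaxAge = 600;                              // re-fetch at most every 10 minutes

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $cacheMaxAge) {
    // Cache is fresh: no API request is made for this page view.
    $data = file_get_contents($cacheFile);
} else {
    // Cache is stale or missing: fetch once, then save it for later visitors.
    $ctx = stream_context_create(['http' => [
        'header'  => "User-Agent: MyWeatherSite/1.0 (https://example.com/; you@example.com)\r\n",
        'timeout' => 15,
    ]]);
    $data = file_get_contents('https://api.weather.gov/alerts/active/zone/OHZ055', false, $ctx);
    if ($data !== false) {
        file_put_contents($cacheFile, $data); // needs a writable cache directory
    }
}
```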
It’s being blocked by NWS. Had the same issue, so I had to drop them and switch to WU for updates and create a link to weather.gov for the alerts. Bots/crawlers are creating too many requests.
It may be different for different NWS web sites, but I know that access to some NWS data is rate limited based on the IP address requesting it. So bots/crawlers shouldn’t affect your access too much unless they’re showing up as being behind the same proxy as you are.
IMHO a bot loading a dashboard will also trigger the NWS and other data loads.
And a lot of ill-designed bots and script-kiddies follow every link in the HTML.
Therefore you should use a cron job (see the sketch below) if you have more than one or two links to NWS areas.
On hosting servers with a lot of user websites behind the same IP address (such as free hosting), the use of an extra cron job for even one NWS alert area is also necessary.
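A minimal sketch of that approach, assuming a small PHP CLI script run from cron so that page views only read the cached files: the zones, paths, contact details, and crontab entry are hypothetical and would need to match your own setup.

```php
<?php
// refresh-nws-cache.php - hypothetical cron-driven cache refresher (run from the CLI).
// Example crontab entry (every 10 minutes):
//   */10 * * * * /usr/bin/php /home/youruser/public_html/refresh-nws-cache.php
$zones    = ['OHZ055', 'OHZ056'];   // the NWS zones your pages display
$cacheDir = __DIR__ . '/cache';     // must be writable by the cron user

$ctx = stream_context_create(['http' => [
    'header'  => "User-Agent: MyWeatherSite/1.0 (https://example.com/; you@example.com)\r\n",
    'timeout' => 15,
]]);

foreach ($zones as $zone) {
    $data = file_get_contents("https://api.weather.gov/alerts/active/zone/$zone", false, $ctx);
    if ($data !== false) {
        file_put_contents("$cacheDir/nws-alerts-$zone.json", $data);
        echo "Refreshed $zone\n";
    } else {
        echo "Fetch failed for $zone\n";
    }
}
```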
My website has a shared IP address (I have since requested a unique IP). It seems someone was doing malicious stuff and the IP was flagged. I also use some scripts that load NWS forecasts for the area; on further investigation they had ceased to work as well. They are now functioning; my host did something to shut down the malicious activity.
" Advisory Information Unavailable, error fetching or reading data from the NOAA advisories server."
I have had that error for weeks but didn’t try to deal with it until now. I am still missing half my home page, which is more of a priority until I get that fixed.
It would help enormously if you guys would put links to your websites IN YOUR PROFILES, not just in one post. Anyone trying to help you has to trawl through the whole topic to find a link.
The status page should be telling you that updates are available, but for some reason it’s not showing there.
NOAA changed things earlier this year, and the updated scripts (or the removal of deprecated ones) are picked up in the latest check-fetch-times.php version.
The latest version of check-fetch-times.php is (always) available at
It’s always a good idea to download from there, upload to your site, and run check-fetch-times.php?show=versions to see what updates you currently need. Note: check-fetch-times.php is the same version used on all the Saratoga templates (Canada, USA, World).
The displays of version info in wxstatus.php were added in include-wxstatus.php Version 1.10 (16-May-2020).
Keeping your site up-to-date is very important, as hosting providers migrate to newer versions of PHP, and some PHP changes have removed or deprecated functions that were used in older scripts in the templates (and in many third-party scripts).
The NWS change to alerts.weather.gov on 30-Jan-2024 is the reason that two scripts (atom-advisory.php and atom-top-warning.php) were deprecated since it was too much trouble to rewrite them. Their replacement (nws-alerts) was rewritten to use the new ATOM advisories available from api.weather.gov.
Moral: if you let your website fall behind maintenance levels, you risk having it crash as PHP (or data source websites) change.
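For anyone curious what the api.weather.gov data looks like, here is a hedged sketch that pulls the active alerts for one zone as GeoJSON and lists the headlines. The zone ID and contact details are placeholders, and the real nws-alerts script does far more (caching, formatting, icons).

```php
<?php
// Sketch: list active alert headlines for one zone from api.weather.gov.
// Zone ID and contact details are placeholders, not the template's actual settings.
$ctx = stream_context_create(['http' => [
    'header'  => "User-Agent: MyWeatherSite/1.0 (https://example.com/; you@example.com)\r\n",
    'timeout' => 15,
]]);
$raw  = file_get_contents('https://api.weather.gov/alerts/active/zone/OHZ055', false, $ctx);
$json = ($raw !== false) ? json_decode($raw, true) : null;

if (!is_array($json) || !isset($json['features'])) {
    echo "No data returned (blocked, rate-limited, or a network problem).\n";
} elseif (count($json['features']) === 0) {
    echo "No active alerts for this zone.\n";
} else {
    foreach ($json['features'] as $alert) {
        $p = $alert['properties'];
        echo $p['event'] . ': ' . ($p['headline'] ?? '') . "\n";
    }
}
```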