Here's a new NANY idea/request:
URL Speed Tracker
Idea: A utility that continuously tracks the speed of opening certain web pages or downloading files.
Motivation: It can be important, especially when setting up a new server, to watch page speeds and catch any failures to load. This program would continuously grab pages and track speeds over time so you can see the health of a server.
- The main interface should be a grid where you can add new URLs (of pages or files).
- Then a set of columns for running speed averages, averaged over longer windows (elaborated below).
- A button to toggle fetching on/off.
Options:
- 1. How often it should recheck URLs (see the fetch-loop sketch after this list).
- 2. It should save/load the last set of URLs.
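To make the idea concrete, here is a minimal sketch of the fetch side, assuming Python and only the standard library. The file name urls.txt, the CHECK_INTERVAL value, and the helper names are placeholders for illustration, not part of the request.

```python
# Minimal sketch of the fetch loop: times each download and reports bytes/sec.
# Python 3, standard library only.  urls.txt and CHECK_INTERVAL are assumptions.
import time
import urllib.request
from pathlib import Path

URL_FILE = Path("urls.txt")       # one URL per line; persisted between runs (option 2)
CHECK_INTERVAL = 60               # option 1: how often to recheck, in seconds

def load_urls():
    """Load the saved URL list, or return an empty list if none exists yet."""
    if not URL_FILE.exists():
        return []
    return [u.strip() for u in URL_FILE.read_text().splitlines() if u.strip()]

def measure_speed(url, timeout=30):
    """Download the URL once and return speed in bytes/second, or None on failure."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            size = len(resp.read())
    except Exception:
        return None                # a failed fetch is itself a health signal
    elapsed = time.monotonic() - start
    return size / elapsed if elapsed > 0 else None

if __name__ == "__main__":
    while True:
        for url in load_urls():
            speed = measure_speed(url)
            status = f"{speed / 1024:.1f} KB/s" if speed else "FAILED"
            print(f"{time.strftime('%H:%M:%S')}  {url}  {status}")
        time.sleep(CHECK_INTERVAL)
```

A GUI version would feed these per-try speeds into the grid instead of printing them, but the timing and rechecking logic would be the same.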
With regard to tracking averages over longer windows:
- A really deluxe version would be able to plot a given URL's speed over time.
- But short of that, what one wants is some indication of average speed over different time windows. For example, it would be nice to know the last speed (an average over 1 try), then the average over maybe the last 10 tries, then over the last 100, then over the last 1000, etc. (the last value gives you a true baseline).
- Rather than keeping track of every download speed (which you would need if you want to graph), a simple heuristic is to keep, for each URL, a set of decaying averages with different decay rates, i.e. DisplayValue = DisplayValue*Decay + NewValue*(1 - Decay), where Decay is between 0 and 1, and the lower the value, the shorter the effective averaging window (see the sketch after this list).
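Here is a rough sketch of that bookkeeping, using the exponential-moving-average form of the formula above. The 1/10/100/1000 window sizes and the class and field names are assumptions for illustration, not a fixed spec.

```python
# Sketch of the decaying-average bookkeeping for one URL: several exponential
# moving averages whose decay rates roughly mimic windows of 1, 10, 100 and
# 1000 tries.  WINDOWS and the names below are assumptions, not a spec.
WINDOWS = [1, 10, 100, 1000]      # approximate number of tries each average spans

class SpeedAverages:
    def __init__(self):
        # one running average per window; None until the first sample arrives
        self.avgs = {w: None for w in WINDOWS}

    def add_sample(self, speed):
        """Fold one measured speed into every window's decaying average."""
        for w in WINDOWS:
            decay = 1.0 - 1.0 / w          # window of 1 -> decay 0 (last value only)
            prev = self.avgs[w]
            self.avgs[w] = speed if prev is None else prev * decay + speed * (1.0 - decay)

    def row(self):
        """Values for the grid columns, longest window last (the 'true baseline')."""
        return {f"last {w}": self.avgs[w] for w in WINDOWS}

# Example: feed in some measured speeds (KB/s) and show the column values.
if __name__ == "__main__":
    tracker = SpeedAverages()
    for s in [500, 480, 30, 510, 495]:     # hypothetical samples; 30 is one slow fetch
        tracker.add_sample(s)
    print(tracker.row())
```

Each column in the grid then just shows its running average, so only a handful of numbers per URL need to be stored, with no history kept at all.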
I could really use such a thing... any takers? If not, I may code it myself.
Bonus:
I can see how this would be useful in a cross-platform environment, specifically if it could run continuously in command-line mode on a Linux server. But for me, right now, a Windows-only version would be fine.