Application Name | Speed Monitor |
Version | 1.0 |
Short Description | Continuously tracks the loading speed of websites with an adjustable check interval and averages from the last 10, 100, and 1000 tries. Graphs the last 100 tries. Idea from mouser's post. |
Supported OSes | Windows with .NET |
Web Page | Soon... |
Download Link | v1.0 - 86K EXE OR Download from Softpedia OR Download from Softoxi |
System Requirements | |
Version History | - v1.0 - Added drag-and-drop, clear data button, slower default interval, and quick hint
- v0.9 - Charts added
- v0.8 - First-time lag removed, URLs no longer require "http://", small other tweaks
- v0.7 - Save/Load of URLs added
- v0.6 - Basic functions running (with averaging)
- v0.4 - Single data point acquisition added
- v0.2 - Timers added
- v0.1 - Basic UI
|
Author | NinJA999 (Nick Aldwin) |
Description
Continuously tracks the loading speed of websites with an adjustable check interval and averages from the last 10, 100, and 1000 tries. Graphs the last 100 tries.
Idea from mouser's post:
Here's a new NANY idea/request:
URL Speed Tracker:
Idea: A utility meant to continuously track the speeds of opening certain web pages/downloading files.
Motivation: It can be important, especially when setting up a new server, to be able to watch the speed and watch for any failures of pages to load. This program will continuously grab pages and track speeds over time to let you know the health of a server.
- The main interface should be a grid where you can add new urls (of pages or files).
- Then a set of columns for running speed averages, averaged over longer windows (I can elaborate later).
- A button to toggle on/off fetching.
Options:
- 1. how often it should recheck urls
- 2. it should save/load the last set of urls.
With regard to tracking over longer averages:
- A really deluxe version would be able to plot the speeds over time of a given url.
- But short of that, what one wants is some indication of avg speed over different time windows. So, for example, it would be nice to know the last speed (avg over 1 try), then over maybe the last 10 tries, then over the last 100, over the last 1000, etc. (the last value would give you a true baseline).
- Rather than keeping track of every download speed (needed if you want to graph), a simple heuristic solution would be to keep track of, for each url, a decaying average with a different decay rate, i.e. DisplayValue = DisplayValue*Decay + NewValue, where Decay is from 0 to 1, and the lower the value, the shorter the time window for averaging.
I could really use such a thing... any takers? If not, I may code it.
Bonus:
I can see how this would be useful in a cross-platform environment, specifically if it could be run continuously in command-line mode on a Linux server. But for me right now a Windows-only version would be fine.
-mouser
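The decaying-average heuristic in mouser's post can be sketched in a few lines. This is an illustrative Python sketch, not the program's actual .NET code; it also normalizes the new value by (1 - Decay), an assumption added here so the result stays on the same scale as the measurements (the bare DisplayValue*Decay + NewValue form would grow by a factor of 1/(1-Decay)):

```python
def make_decaying_average(decay):
    """Exponential moving average with decay in (0, 1).
    The lower the decay, the shorter the effective averaging window."""
    state = {"value": None}

    def update(new_value):
        if state["value"] is None:
            state["value"] = float(new_value)  # seed with the first sample
        else:
            # mouser's DisplayValue = DisplayValue*Decay + NewValue,
            # normalized by (1 - decay) so it tracks the mean.
            state["value"] = state["value"] * decay + new_value * (1 - decay)
        return state["value"]

    return update

short = make_decaying_average(0.5)   # short window: reacts quickly
long_ = make_decaying_average(0.99)  # long window: approximates a baseline
for ms in [100, 100, 100, 400]:      # three normal loads, then a spike
    s = short(ms)
    l = long_(ms)
print(round(s, 1), round(l, 1))  # → 250.0 103.0
```

Note how the short-window average jumps toward the 400 ms spike while the long-window "baseline" barely moves, which is exactly the behavior the post asks for.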
Features
- Multiple websites
- Minimizes to tray
- Editable check frequency
- Averages over the last 1/10/100/1000 tries
- Enable/disable switch
- If a website is unreachable, its time turns red (and reads -1)
- Automatic saving/loading of URLs
- Charts of the last 100 measured times
- Drag-and-drop for URLs
- Easy data reset
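Speed Monitor itself is a .NET application, but the core of what the grid computes — timing a fetch (with -1 on failure) and averaging the last 1/10/100/1000 tries — can be illustrated with a short Python sketch. The names here (time_fetch, SpeedHistory) are hypothetical, not taken from the program:

```python
import time
import urllib.request
from collections import deque

def time_fetch(url, timeout=10):
    """Return the load time of `url` in milliseconds,
    or -1 if it is unreachable (like the red -1 in the grid)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()
    except OSError:
        return -1
    return (time.monotonic() - start) * 1000.0

class SpeedHistory:
    """Keeps the last 1000 successful measurements and reports
    averages over the last 1/10/100/1000 tries, like the grid columns."""
    def __init__(self):
        self.times = deque(maxlen=1000)  # old samples fall off automatically

    def add(self, ms):
        if ms >= 0:              # failed fetches don't pollute the averages
            self.times.append(ms)

    def average(self, n):
        window = list(self.times)[-n:]
        return sum(window) / len(window) if window else None

hist = SpeedHistory()
for ms in [120, 80, 100, -1, 100]:   # one failed fetch in the middle
    hist.add(ms)
print(hist.average(1), hist.average(10))  # → 100.0 100.0
```

A deque with maxlen handles the "last 1000" window for free; mouser's decaying-average trick avoids even that storage, at the cost of losing the exact samples needed for charting.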
Planned Features
- Timeline on charts
- Better operation (less memory-intensive, etc.)
Screenshots

Usage

Installation
Run the EXE.
Using the Application
From Help->Explain This!:
Use this tool to monitor the average length of time it takes to load a webpage or file.
Enter a URL into a row, then click "Enable Fetching" to start gathering data.
The last duration (in milliseconds) will appear in the "Last (ms)" column, the average duration over the last 10 tries will appear in the "Last 10" column, and so on.
Click the "Chart" button to see a line graph of the last 100 measured times.
You can change the frequency of tries (in seconds) by editing the value in the "Frequency" text box and clicking "Set".
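The re-check cycle driven by the "Frequency" box can be modeled as a simple background timer that fires until fetching is disabled. This is a minimal Python sketch under that assumption (start_checker is a hypothetical name, not part of the program):

```python
import threading
import time

def start_checker(check, interval_s):
    """Call `check()` every `interval_s` seconds until the returned
    event is set -- a minimal stand-in for the frequency/"Set" behavior."""
    stop = threading.Event()

    def loop():
        # Event.wait doubles as an interruptible sleep: it returns True
        # (ending the loop) as soon as stop.set() is called.
        while not stop.wait(interval_s):
            check()

    threading.Thread(target=loop, daemon=True).start()
    return stop

counts = []
stop = start_checker(lambda: counts.append(1), interval_s=0.02)
time.sleep(0.15)   # let a few checks run
stop.set()         # corresponds to clicking "Disable Fetching"
print(len(counts) >= 2)
```

Using an Event for both the delay and the shutdown signal means disabling takes effect immediately instead of waiting out the current interval.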
Each time you add a new URL, a new blank row will appear.
To delete a row, select it and press the [DELETE] key on your keyboard.
You can also add a URL to the list by dragging it and dropping it onto the list.
To clear all time data for the URLs, click "Clear Data".
If you minimize the program, it will shrink to a small icon in the system tray. Double-click the icon to restore the window.
URLs are persisted between program runs.

Uninstallation
Delete the EXE and the smu file.
Known Issues
- Initial time always seemed to be too high. Fixed! It was a silly .NET bug.
NOTE to those experiencing warnings from their security software:
There is no malicious code in this program. It may be flagged because it generates a high volume of network traffic, which is inherent to what the program does.
This program has been rated 100% clean by Softpedia.
It has also been verified as clean from any malware by Softoxi.