

NANY 2011 :: Entry Ideas & Requests


NinJA999:
I have an idea: a personal spending habit report tool.
Now that Wesabe is gone, I'm looking for an app where you can tag, categorize, and report on your spending and income using bank statements exported from online banking sites. So you import a series of bank statements, categorize the transactions, and run reports with graphs showing where your money goes and how your spending habits have changed. It would be an app with a big market.

What was great about Wesabe for me was that it recognised similar lines in bank statements, so the second time you shop at shop X it would already categorize that line for you.
-justice (August 05, 2010, 04:11 AM)
--- End quote ---

Sorry to reply so much later (I just saw this via the newsletter), but have you tried Mint.com?  I use it for *exactly* what you describe.  And it can automatically pull the information from your bank accounts as transactions post -- so before you even get a statement!  You should really check it out.

And no, I don't work for Mint.  I've just been using it for a couple of years, and it's super cool to be able to track my spending habits over that time.
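
If someone does want to build a Wesabe replacement for NANY, the "remember how I categorized shop X" part is simple at its core.  Here's a minimal Python sketch (all names hypothetical, and real statement lines would need fuzzier matching than this):

--- Code: ---# Remember the category a user assigned to a merchant line,
# and suggest it when a similar statement line shows up again.

def normalize(description):
    """Reduce a raw statement line to a stable merchant key."""
    # Strip digits (dates, card/reference numbers) and extra spaces.
    key = "".join(c for c in description.upper() if not c.isdigit())
    return " ".join(key.split())

class Categorizer:
    def __init__(self):
        self.known = {}  # merchant key -> category

    def learn(self, description, category):
        self.known[normalize(description)] = category

    def suggest(self, description):
        return self.known.get(normalize(description))

c = Categorizer()
c.learn("SHOP X #1234 08/05", "Groceries")
print(c.suggest("SHOP X #5678 08/12"))  # -> Groceries
--- End code ---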

mouser:
Here's a new NANY idea/request:

URL Speed Tracker:

Idea: A utility that continuously tracks how fast certain web pages load or files download.

Motivation: Especially when setting up a new server, it can be important to monitor page-load speeds and catch any pages that fail to load.  This program would continuously grab pages and track speeds over time to let you know the health of a server.


* The main interface should be a grid where you can add new URLs (of pages or files).
* Then a set of columns for running speed averages, averaged over progressively longer windows (I elaborate below).
* A button to toggle fetching on/off.
Options:

* How often it should recheck the URLs.
* It should save/load the last set of URLs.
With regard to tracking over longer averages:

* A really deluxe version would be able to plot the speeds of a given URL over time.
* But short of that, what one wants is some indication of average speed over different time windows.  For example, it would be nice to know the last speed (an average over 1 try), then the average over maybe the last 10 tries, then over the last 100, then over the last 1000, etc. (the last value would give you a true baseline).
* Rather than keeping track of every download speed (needed only if you want to graph), a simple heuristic is to keep, for each URL, several decaying averages with different decay rates, i.e. DisplayValue = DisplayValue*Decay + NewValue*(1 - Decay), where Decay is between 0 and 1 and a lower Decay means a shorter averaging window; see the sketch just below.
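To make that concrete, here's a rough Python sketch of the decaying-average tracker (the URLs, decay values, and fetch interval are all placeholders, and the loop is the simplest thing that could work -- not a finished design):

--- Code: ---import time
import urllib.request

# Decay rates: a lower decay means a shorter effective window.
# A decay of d approximates an average over the last 1/(1-d) tries.
DECAYS = {"~10": 0.9, "~100": 0.99, "~1000": 0.999}

def fetch_speed(url):
    """Download url once and return the speed in bytes/second."""
    start = time.time()
    with urllib.request.urlopen(url, timeout=30) as resp:
        data = resp.read()
    elapsed = time.time() - start
    return len(data) / elapsed if elapsed > 0 else 0.0

def track(urls, interval=60):
    averages = {url: {} for url in urls}  # url -> window label -> average
    while True:
        for url in urls:
            try:
                speed = fetch_speed(url)
            except OSError as e:
                print(f"{url}: FAILED ({e})")
                continue
            for label, decay in DECAYS.items():
                prev = averages[url].get(label, speed)
                # DisplayValue = DisplayValue*Decay + NewValue*(1 - Decay)
                averages[url][label] = prev * decay + speed * (1 - decay)
            cols = ", ".join(f"{label}: {v:,.0f} B/s"
                             for label, v in averages[url].items())
            print(f"{url}: last {speed:,.0f} B/s, {cols}")
        time.sleep(interval)

if __name__ == "__main__":
    track(["http://somedomain.com/"])
--- End code ---

The three columns approximate the last-10/last-100/last-1000 averages without storing any history, which is the whole point of the heuristic.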
I could really use such a thing.. any takers?  If not, I may code it.

Bonus:
I can see how this would be useful in a cross-platform environment, specifically if it could run continuously in command-line mode on a Linux server.  But for me, right now, a Windows-only version would be fine.

Gothi[c]:

I can see how this would be useful in a cross-platform environment, specifically if it could run continuously in command-line mode on a Linux server.  But for me, right now, a Windows-only version would be fine.

--- End quote ---

Linux already has this, kind of:


--- Code: ---time wget -p http://somedomain.com
--- End code ---

(The -p flag pulls in page requisites like images and CSS, and time reports how long the whole fetch took.)

JavaJones:
I would love to have an app like that too, but I'd need a Windows solution.

- Oshyan

NinJA999:
Here's a new NANY idea/request:

URL Speed Tracker: [...] I could really use such a thing.. any takers?  If not, I may code it.
-mouser (December 18, 2010, 06:55 AM)
--- End quote ---

I've been kludging something together for this, and since I've spent a little time on it I figured I'd pledge.
