
Recent Posts

Pages: prev1 ... 40 41 42 43 44 [45] 46 47 48 49 50 ... 106next
1101
Living Room / Re: What's Your Internet Speed/Reliability SATISFACTION?
« Last post by JavaJones on November 17, 2010, 02:11 AM »
I don't find speedtest.net's results to be the most reliable either, but sometimes they sure can look good!

[speedtest.net result screenshot: San Jose server]
That's from a server in San Jose, but oddly I got a result around 4 Mbit/s download (similar upload) to a server in SF upon repeated attempts. So clearly the server matters *a lot*. In fact the SF result is so bad that I'd have to guess either a temporary bandwidth condition on that test site's part, or some throttling on their end (I'm leaning this way). Here's a more realistic result from NY, cross country, not too bad:

[speedtest.net result screenshot: New York server]
and Baltimore, MD, surprisingly good:

[speedtest.net result screenshot: Baltimore, MD server]
In reality I have a 22/5 connection, Comcast Business Class. It's $100/mo here in San Francisco. For the bandwidth I get, I have few complaints. Uptime is very good overall. And with Business Class they don't mess with my connection in any way - unthrottled, no protocol limiting, no bandwidth limits, nothing. I've downloaded over 1TB in a month before, and not a peep from them. :D

I usually use SpeakEasy (an ISP) for their speed test and find it more consistent and realistic. But they only have US test servers, and there's no image to link to, you just have to copy/paste the text results like this:
San Francisco Server:
Download Speed: 46388 kbps (5798.5 KB/sec transfer rate)
Upload Speed: 7523 kbps (940.4 KB/sec transfer rate)

New York Server:
Download Speed: 7239 kbps (904.9 KB/sec transfer rate)
Upload Speed: 3927 kbps (490.9 KB/sec transfer rate)

Sometimes even a remote site will surprise you.
Washington DC:
Download Speed: 34149 kbps (4268.6 KB/sec transfer rate)
Upload Speed: 4234 kbps (529.3 KB/sec transfer rate)
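Incidentally, the KB/sec figures SpeakEasy shows alongside the kbps numbers are just a divide-by-eight conversion (8 kilobits per kilobyte). A quick sanity check against the results quoted above (note that for exact halves, rounding may differ by 0.1 from what the site displays):

```python
def kbps_to_kBps(kbps):
    """Convert a kilobits/sec speed-test figure to kilobytes/sec (8 bits per byte)."""
    return round(kbps / 8, 1)

# Spot-check against the SpeakEasy numbers quoted above:
print(kbps_to_kBps(46388))  # San Francisco download: 5798.5 KB/sec
print(kbps_to_kBps(34149))  # Washington DC download: 4268.6 KB/sec
```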

Anyway, I'm pretty happy with what I've got. My greatest condolences to those in this thread with really bad connections, ouch! But I guess it just means fewer inane Youtube videos mostly. ;)

- Oshyan
1102
Living Room / Re: Would someone mind recommending a good laser printer?
« Last post by JavaJones on November 16, 2010, 10:56 PM »
Ok so at my last job, rather than have a single monster machine networked that everyone used (or one per floor), most everyone had a printer of some kind on their desk. This gave me the chance to work with a lot of different models and brands, almost all of them laser printers. Here are my thoughts:

Samsung: Very good overall, with the fewest problems and most consistent operation. I'd probably give the nod to Samsung as the best of the brands we had. We had a number of older Samsungs that outlasted newer purchases from e.g. Brother. I also have a Samsung at home that's been pretty good. Fairly low volume, but I still haven't had to replace the toner in maybe 2 years.

Brother: Good printers overall, some jamming and streaking issues, definitely "disposable" oriented. We had a few with problems and ultimately ended up just scrapping them - repair cost wasn't worth it. Mind you we bought more Brothers than almost anything else so they were bound to crap out in higher numbers (though I don't think a higher *percentage* did).

HP: Not bad, though they used to be a lot better. We didn't buy many of them, the few we had weren't outstanding (although not miserable either). I have the same complaints about bloated drivers and whatnot though.

Xerox: Only minimal experience, was ok, but honestly I don't think they compete much in the consumer space anymore.

Canon: Really the best for inkjet when you factor in speed and quality. This is both from work and personal experience. I recently got a Canon MP640 multifunction and couldn't be happier. It's fully networked so I can print and *scan* from any computer on my home network. Very slick.

Which leads me to: Wifi in printers is NOT a gimmick! There are a few reasons why.

First, many printers still do not include basic network (RJ45) support, so to get them on a network you need to either buy a print server (more expensive than the added cost of wifi in a printer these days - it's come down a lot in the last year or so), or you have to have them connected to a computer that's always on. Add to that my experience with the best wireless print server I could find - which is not very good - and I have to say the native wifi experience is MUCH better.

Second, when a manufacturer builds wifi in instead of relying on connection to a PC or a print server, they tend to tailor the functionality to working nicely across the network. My Canon MP640 is a fantastic example. Scanning across the network is just like scanning locally. I can open up a scan app from my laptop on wifi and scan just like I would on my desktop. Alternatively I can go to the unit itself and scan something and send it to any machine on the network with the driver installed. The UI is pretty slick and everything works fast. Win!

Third, the cost of wifi added to printers these days is generally not that much. Here's an example:

The Samsung ML-2525 (no wireless) - $119
Samsung ML-2525W (wireless included) - $139

That tends to hold up in the consumer space pretty well, i.e. moving into the inkjet realm, you might see a $10-$20 difference on a ~$100 printer for adding wifi. A large proportion of the Canon MFC line just comes with wireless now by default. I think more manufacturers are probably going that route, and it's just fine with me.

Finally, wireless is actually quite useful if you do want to share a printer with a household or small work group, and want freedom in where to position the thing. Not near a network port and don't want to string a cable over to it? No problem!

Really, for the small added cost, wireless is *great*.

- Oshyan
1103
Living Room / Re: Cute Parody Ad
« Last post by JavaJones on November 16, 2010, 10:08 PM »
BAHAHAHAAH. Yes!!!! That's so, so satisfying.

- Oshyan
1104
Living Room / Re: Laptop choice: better CPU or more RAM ?
« Last post by JavaJones on November 16, 2010, 09:26 PM »
Believe it or not in this case the higher-numbered CPU might actually be lower performance. It's 200Mhz per core slower, as you can see on the specs, but does have a 2MB instead of 1MB cache. Larger caches only matter for certain kinds of workloads though, and in general do not come into play for average "office tasks". You can see a benchmark result comparison (for only one benchmark admittedly) here:
http://www.cpubenchm...re+T4500+%40+2.30GHz
http://www.cpubenchm...uo+T6570+%40+2.10GHz
and a couple more:
http://www.notebookc...ocessor.25727.0.html
http://www.notebookc...ocessor.34898.0.html
Note that in those benchmark results, SuperPi is one of the few places you'll see the T6570 pulling ahead, most likely due to its cache.

Now, even if the "lower" CPU weren't faster in the majority of tasks, I'd still recommend more RAM, especially if you're getting Win7. While Win7 doesn't seem to require as much RAM as Vista, it still runs better with 4GB than 2GB. Combine that with the fact that the first option is cheaper, has more HD space, and a memory card reader (dunno if it's important for you but I won't buy a laptop without one), and I think the choice is clear.

- Oshyan
1105
Presumably your incoming connection goes through a router/gateway/thingamajig of some kind. What kind? Does it have any nifty management capabilities? Possibly that has bandwidth monitoring, perhaps even per-user...

- Oshyan
1106
General Software Discussion / Re: extracting info from pdf
« Last post by JavaJones on November 16, 2010, 08:26 PM »
I'll admit I only skimmed this topic, so this may be wildly far off the mark, but what about opening in Photoshop and rasterizing at the desired resolution, then cropping? Given that screenshot approaches are being talked about here, I take it having the tables, etc. in their actual original format (i.e. being translated into a Word table) is not a requirement, thus the Photoshop approach should work well (if I've understood the requirement from reading the first few posts and skimming the rest  :-[).

- Oshyan
1107
General Software Discussion / Re: Crashplan backup software
« Last post by JavaJones on November 16, 2010, 02:13 PM »
I've been looking at these guys for backup after running into a rather dramatic data crisis recently. They're one of the few that offer a "data seeding" option where for a fee they'll send you a drive you can put most/all of your data onto and then you send it back and they "seed" your backup. You can do this with up to 1TB of data for around $125, shipping paid both ways, and when you have 1+TB of data this is actually a really valuable and potentially worthwhile service. Their service cost is also quite reasonable, they support "unlimited" data, and I have good reason to believe they really will support 1+TB of data within their normal fee structure (which is cheap). They'll also send you a physical copy of your backup data if desired as an optional extra-cost service, again very nice if you need to restore in a hurry after a total loss. Basically what I've been looking for is a way to get my data backed up in as many locations as possible, preferably at least one "off-site". These guys seem like one of the best, most cost-effective ways to do that.

- Oshyan
1108
Mini-Reviews by Members / Re: Quick Comparision of Flickr/ipernity/PicassaWeb
« Last post by JavaJones on November 16, 2010, 01:05 PM »
Flickr's recent (as in over the past year) photo display changes have improved its standing in my mind, but Smugmug still rules the roost for quality of presentation. Flickr always makes me feel like the photos being shown are crappier than they are somehow, just by nature of my dislike of their UI. Can't quite put my finger on it. But I guess it's (mostly) just me since it's so damn popular.

Picasa I like well enough; the features and integration with the desktop apps are great. The photo presentation is not the greatest, but IMO better than Flickr's. The zoom function is not my favorite, but it's workable. Face recognition is nice. The social aspects aren't nearly as well developed as Flickr's, I'll grant. For the moment I do use Picasa as my primary photo sharing tool.

Permissions are not so much a concern for me, but Jibz your use-case is rather interesting and inspires me to consider working that way in the future. I will look at whether Smugmug allows that.

- Oshyan
1109
Living Room / Re: How to understand all the Intel chip types?
« Last post by JavaJones on November 16, 2010, 03:35 AM »
Yeah, I never quite got the "slowest piece sets the score" thing. Granted your slowest hardware could greatly influence your overall performance, depending on what you use your system for (e.g. if you don't do gaming, a slow graphics card may not matter much, though see recent trends with GPU acceleration of common web browsing for some caveats...). I do think though that the individual scores, particularly CPU, could still be valuable.

- Oshyan
1110
Living Room / Re: The Story of Stuff - Cosmetics, Bottled Water...
« Last post by JavaJones on November 16, 2010, 03:32 AM »
If you can find an economically viable way to auto-sort trash I reckon you've got a nice little startup business on your hands! :D

- Oshyan
1111
Post New Requests Here / Re: Idea: Productivity Suite and Monitor.
« Last post by JavaJones on November 16, 2010, 02:47 AM »
I reckon if you required the coders to use a Version Control System (source manager), you could monitor their check-in and check-out activity. There are basic "activity level" graphs for most such systems, and I'm sure there are more advanced activity analysis modules/plugins for e.g. SVN, Mercurial, Git, etc. Use of a VCS is advisable for any coding project anyway, and most professional coding teams/groups should already be using one. They might not like you requiring them to use yours, but again you can just make it part of the terms of your contract.
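To illustrate the kind of check-in monitoring I mean: git's built-in `git shortlog -s -n --since="1 month ago"` prints per-author commit counts, and a few lines of scripting turn that into usable activity numbers. A minimal sketch (the author names and counts below are made up):

```python
def parse_shortlog(output):
    """Parse `git shortlog -s -n` output, where each line is
    '<count><TAB><author>', into an author -> commit-count dict."""
    counts = {}
    for line in output.strip().splitlines():
        count, _, author = line.strip().partition("\t")
        if author:
            counts[author] = int(count)
    return counts

# Hypothetical output from: git shortlog -s -n --since="1 month ago"
sample = "   42\tAlice\n   17\tBob\n"
print(parse_shortlog(sample))  # {'Alice': 42, 'Bob': 17}
```

Commit counts are of course a crude proxy for productivity, but as a baseline activity graph it's cheap and automatic.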

- Oshyan
1112
Living Room / Re: Hidden Netflix Marathon Gems to Watch Online
« Last post by JavaJones on November 16, 2010, 01:04 AM »
What system are you playing it on and what is the graphics card used? What is your normal media player and do you have the same problem there? (presumably not) If you did at some point have a similar problem in other media players, how did you correct it? It's possible Silverlight is using "overlay" mode or some other specialized video streaming display system, which may be controllable by advanced settings in a graphics card driver.

- Oshyan
1113
General Software Discussion / Re: change internet speed on the fly
« Last post by JavaJones on November 16, 2010, 12:19 AM »
There are also a few other possible scenarios. 1: his son had a bandwidth throttling app on his system (these exist, here's one), or (perhaps more likely?) 2: they have a router that they both connect to that has a throttling implementation of some kind, and he adjusted the priority there. Perhaps a device running DD-WRT or Tomato.

- Oshyan
1114
Living Room / Re: Two broadband connections at the same time?
« Last post by JavaJones on November 16, 2010, 12:15 AM »
Note that, as far as I'm aware, this works the same way multi-core CPUs do: you need multithreaded workloads to see the difference! In other words you can *not* download 1 single file at twice the speed using both connections simultaneously. But you could download 2 files simultaneously, each at 2 Mbit/s, saturating both lines. If that sounds like what you want, then a load-balanced router is probably the best way to go. That way all the machines on the router can benefit, and you're more likely to take full advantage of both connections too.
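A toy model makes the point about stream counts concrete, assuming uniform links and each download pinned to a single link (which is how typical per-connection load balancing behaves):

```python
def aggregate_mbits(per_link_mbits, num_links, num_streams):
    """Best-case combined throughput when each stream can ride only one
    link: capped by both the stream count and the total link capacity."""
    return min(num_streams, num_links) * per_link_mbits

print(aggregate_mbits(2, 2, 1))  # 2 -- one download can't exceed one 2 Mbit link
print(aggregate_mbits(2, 2, 2))  # 4 -- two downloads can saturate both links
```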

- Oshyan
1115
Living Room / Re: Tipping - Why does this appear to be a "requirement"?
« Last post by JavaJones on November 15, 2010, 11:22 PM »
Very interesting thread. My girlfriend has been a server for about 7 years; she's actually good at it, but of course doesn't want to do it for the rest of her life. Fortunately she's getting out of it now. But I digress. For obvious reasons I have a lot of understanding of and sympathy for the situation servers and other restaurant staff are in. It's an industry-wide systemic issue, and yes, it's unlikely to change any time soon. So the thing is, I agree in principle with Josh, but I know that reality has other plans, so I tip - I tip fairly generously in fact. Usually no more than 20%, but seldom less than 15%! It *is* commensurate with service, however.

The comments about service potentially being worse in other countries are very interesting for me. As I said I agree with Josh and have long wished for a more normal wage situation in food service, both for my needs as a customer, and for the stability and sanity of those employees. I've never really thought that maybe service *is* actually better as a result of our otherwise totally f*****d system though. Obviously a few anecdotal reports are not evidence enough, but it's intriguing at the least. Wraith's comments are particularly interesting, and echo my experiences with a significant other in the industry as well, i.e. not tipping just hurts the servers, it will never effect actual meaningful change.

I actually had vague notions of opening a restaurant at some point and trying an experiment: pay people good wages *and* give them benefits. The catch? I as the owner would not make any (or at least much) money off of it. A lot of owners, at least of successful restaurants, *do* make decent money off it; sometimes very good money. What do they do to justify it? Depends on the owner, certainly. But in many cases not much! Of course they did a lot more to begin with - they usually funded or at least managed it to start, they had the idea, they put in the work early on and got it started. And for many owners there can be a lot of ongoing maintenance, at least if they're not willing/able to hire a good manager. But anyway, I'm still curious just what it would cost to have a restaurant paying people decently, and not expecting tips... I don't really think a 300% increase in food costs is necessary, quite honestly. But then I haven't done the math. I guess one interesting point is that the money is already there and has to come from somewhere. People pay *at most* 20% on average for tips, probably more like 10-15% is the average, and servers industry-wide basically "make ends meet", along with the bussers, cooks, dishwashers, etc. So if that's the case, er, surely increasing the cost of everything 20% would do it??

- Oshyan
1116
I dunno, I've been genuinely excited to announce lots of stuff as part of a company before. I've been responsible (or at least partly) for some of those "we're excited to announce" messages before, and yes, we were in fact genuinely excited. I see no reason to assume Google or even MS would not be, at least sometimes. Maybe not with every "we're excited to announce" announcement they make though. ;)

Oh yes, and good on MS for doing this! It's a great boon to small businesses and non-profits. Especially as MSE is actually a very decent product compared to e.g. McAfee or Norton, both of which are otherwise more likely to get small business sales.

- Oshyan
1117
Living Room / Re: Nokia CEO admits that the cell phone industry is a gimmick.
« Last post by JavaJones on November 15, 2010, 08:32 PM »
There are some additional complications in the US when it comes to network/frequency/spectrum support which do make it actually technologically more difficult to make a single phone which supports all major carriers. That is of course in addition to the carriers making this difficult or impossible, since only 2 of the 4 major carriers (AT&T and T-Mobile) support SIM cards. With Sprint and Verizon (the other two), it's much more difficult to bring in a phone bought out of contract or from a different carrier.

That being said, it doesn't have to be this way. With the coming higher-speed network deployments ("LTE"), they have the opportunity to standardize more. Each carrier tends to own a different portion of the spectrum and has a vested interest in keeping their customers on their spectrum and away from other carriers' service, so this is very unlikely to happen (in fact it's already not happening - Sprint and T-Mobile have both already rolled out "4G" networks with different technologies and spectrums). But it *could* be more standardized...

And what about explicitly allowing alternative OSs? Great I guess, but useful only to a small minority of purchasers and so clearly not a priority for handset manufacturers, especially when you consider issues like DRM control necessary for current content streaming agreements (e.g. Netflix). Having your customers able to use a service like Netflix seems a lot more important than enabling some DIY people to tinker (and, granted, produce some cool stuff).

Interesting though how some businesses do "get by" on "mere" aesthetic differentiation for their products. Think of watches for example. Most watches sold have the same essential functions. In truth the more techie watches, like those with a calculator or pedometer, sell far fewer units than the basic digital watch or analog fashion watch. The major difference between all watches is aesthetic in most cases: fashion. Cell phones could be similar (and this is part of what Apple has done with their products :D). There's nothing saying that Nokia, Motorola, and others could not compete and sell just as well when relying on aesthetic differences vs. technological. In fact, underneath all the bluster about "Droid this" and "Droid that", the real differences between most models in a given functionality bracket (e.g. smart phone vs. "feature phone") are incredibly small most of the time. Maybe 200Mhz more CPU here or some more memory there. Rarely will you have the option of a slide-out keyboard with one model, or a front-facing camera. Front-facing cameras will become standard soon, and slide-out keyboards are functional but also influence aesthetics, and are really a binary decision/difference (have or have not). So really the feature differences are minimal and much is about aesthetics or ease of use/UI already.

Thus I agree we should have more standardization (let go of TouchWiz, Motoblur, and Sense UIs!), incorporate the best of the 3rd party UIs into Android core, and go from there. Each manufacturer can put out one portrait and one landscape phone, each with a variation with or without a slide-out keyboard, and then lots of different aesthetic variations. Imagine, instead of buying a case that has a pattern you like and attaches poorly to your device, you could buy the device with leopard print on it permanently. Hehe.

- Oshyan
1118
Oh yes, no doubt, Betanews is crap now. Fileforum not much better. I get all my content from both sites through RSS now so I very, very seldom visit the site. And 90% of the articles I just skim and mark read. Especially Joe Wilcox. :P

- Oshyan
1119
Living Room / Re: How to understand all the Intel chip types?
« Last post by JavaJones on November 15, 2010, 03:30 AM »
UGH. The Intel CPU naming system is pretty much the most f****d it's ever been. If it were just i3, i5, i7 in that order it'd be great. Theoretically it is, actually. But in reality Intel appears to contradict the rules of their own naming schemes constantly. There are i5s that are well faster than i7s, for example. Or dual core i7s when they were all supposed to be quad. It really must be deliberate, but I'm not entirely sure why. But then I'm not a multi-billion dollar company, so I guess I just wouldn't understand. I try though, I do want to understand why it makes economic sense, quite aside from my frustration about it not making *consumer* sense. I don't like that reality, but I feel like it'd be easier to stomach if I understood the business case for it. But inciting consumer confusion doesn't really seem like a solid business plan to me.

Another thing that bugs me. The whole "Ghz doesn't mean anything anymore". Yeah, sorry, that's kind of BS. It still means a lot. You just can't try to compare across CPU architectures. Fortunately there are only 2-4 (maybe 6 at max) different fundamental architectures at a given time, and most of the time there are only 2 in any given price/performance bracket that you'll be looking at anyway. Not coincidentally, these are almost always the different CPU architectures of the major CPU manufacturers in competition: Intel and AMD. So yes, Intel's Ghz is not equivalent to AMDs Ghz as far as *work done per Ghz*, but within the same family Ghz is a pretty clear and consistent measure of *performance*, with a few modifications.

For the most part all you need to do is multiply the speed of the CPU by the number of physical cores. That will give you a pretty darn good idea of actual performance. So an Intel i7 860 at 2.8Ghz with 4 physical cores could be thought of as a "4x2.8Ghz=11.2Ghz" CPU. Accurate in a technical sense? No, not at all. But it does communicate the *amount* of work it can do at once, relative to other chips with the same architecture.
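That back-of-the-envelope rule is trivial to write down; a sketch, with the caveat from above that it only means anything when comparing CPUs within one architecture family:

```python
def effective_ghz(clock_ghz, physical_cores):
    """Crude same-architecture throughput estimate: clock x physical cores.
    Not a real unit -- just a way to rank CPUs within one family."""
    return clock_ghz * physical_cores

print(effective_ghz(2.8, 4))  # Intel i7 860: "11.2 GHz" of aggregate work
print(effective_ghz(2.3, 2))  # T4500 from the earlier thread: 4.6
print(effective_ghz(2.1, 2))  # T6570: 4.2 -- the "higher" model number ranks lower
```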

Now 2 things can complicate this, but neither actually has such a huge impact on performance that it throws off the scale enough to invalidate it. One is "hyperthreading", which effectively allows instructions to be submitted to additional "virtual" cores and increases performance by 5-15% or so on average. Some i-series CPUs have it, some don't. Those that do are more performant, at least on multithreaded tasks. But again, those CPUs with more actual threads and higher actual Ghz will be faster anyway. The other is "turboboost", the ability of the CPU to clock itself up under single-threaded tasks. This again does not necessarily have a huge impact, though it depends on the user's app mix. Granted, many things are still single-threaded, so A: faster single-core speeds will matter more than more cores, and B: CPUs with turboboost could actually make a difference. But fortunately, as a general rule, CPUs with turboboost are already higher on the performance scale and the straight Ghz measure should suffice. There are plenty of exceptions to this, but it's still worthwhile paying attention to Ghz, and the idea that processor clock speeds should be totally ignored is rather silly, as clock speed is still the main factor in performance.

As I said this doesn't work for comparison between processor architectures, particularly across Intel and AMD. For that you need actual benchmarks anyway because there's no way you'll ever figure it out just looking at spec sheets. For the most part all you need to know these days is that Intel's architectures are more efficient, so generally speaking a given Ghz on an Intel CPU will be more powerful than the same on an AMD CPU. The opposite used to be true back in the days of the original Athlon and P4s, but the tables have turned (and may turn again, but the rule of thumb will probably still be easily deduced).

So long story short, don't bother trying to understand every model # and feature. Just look at CPU speed (Ghz) and number of cores and multiply. That's *if* you do multithreaded work (any graphics app, more and more games, and many other apps, even web browsers are increasingly multithreaded). Otherwise the max speed of a single core is the most important figure. And, as I said, Intel is higher performance per Ghz.

Anyway wasn't the Windows Performance Index supposed to help with all this BS? Why don't we see those scores advertised with new PCs at this point? I always figured that was the eventual goal...

- Oshyan
1120
Living Room / Re: The Story of Stuff - Cosmetics, Bottled Water...
« Last post by JavaJones on November 15, 2010, 02:55 AM »
Wikipedia has some coverage of the cost/benefit analysis that is somewhat interesting (though obviously far from comprehensive):
http://en.wikipedia....ost-benefit_analysis
Here's a decent quote:
Economist Steven Landsburg has suggested that the sole benefit of reducing landfill space is trumped by the energy needed and resulting pollution from the recycling process.[27] Others, however, have calculated through life cycle assessment that producing recycled paper uses less energy and water than harvesting, pulping, processing, and transporting virgin trees.[28] When less recycled paper is used, additional energy is needed to create and maintain farmed forests until these forests are as self-sustainable as virgin forests.

Other studies have shown that recycling in itself is inefficient to perform the “decoupling” of economic development from the depletion of non-renewable raw materials that is necessary for sustainable development.[29] When global consumption of a natural resource grows by more than 1% per annum, its depletion is inevitable, and the best recycling can do is to delay it by a number of years. Nevertheless, if this decoupling can be achieved by other means, so that consumption of the resource is reduced below 1% per annum, then recycling becomes indispensable – indeed recycling rates above 80% are required for a significant slowdown of the resource depletion."
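The quoted claim that recycling can only delay depletion (absent reduced consumption) is easy to check with a toy simulation. All the numbers here are illustrative, not real resource data:

```python
def years_to_depletion(reserves, consumption, growth, recycling_rate, cap=10_000):
    """Simulate annual extraction of a finite resource. Demand grows by
    `growth` per year; a `recycling_rate` fraction of demand is met from
    recycled material instead of new extraction. Returns years until
    reserves run out (or `cap` if they never do)."""
    years = 0
    while reserves > 0 and years < cap:
        reserves -= consumption * (1 - recycling_rate)  # only virgin extraction depletes
        consumption *= 1 + growth                       # demand keeps growing
        years += 1
    return years

# 2% annual demand growth: 80% recycling delays depletion substantially,
# but the reserves still run out eventually.
print(years_to_depletion(1000, 10, 0.02, 0.0), years_to_depletion(1000, 10, 0.02, 0.8))
```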

Some more interesting info (more on the pro-recycling front):
http://www.oberlin.e...u/recycle/facts.html

More:
http://environment.a.../benefit_vs_cost.htm

And a list of some opposing views (as in on both sides of the argument):
http://en.wikipedia...._Opposing_Viewpoints

Overall it looks like, if done right, recycling is generally a net win, especially if you account for all factors (within reason), including the pace of landfill use and the availability (or lack thereof) of space for new landfills.

I think the "recycling does more harm than good" idea is one of those deliciously counter-progressive memes that took hold based on a few factors, one being that yes, the initial recycling efforts were less efficient than would be ideal, but another perhaps stronger factor being that it's simply convenient to believe that recycling isn't that effective and it's being foisted on us by "the man". Because hey, who wants to recycle anyway? It's a pain in the butt. Unfortunately, like everything in this world, the recycling efforts are ultimately only as good as the corporate contractor you have doing the work, and most local governments lack the ability - or at least the will - to enforce the strict oversight necessary to ensure best practices and maximal effectiveness.

Ultimately Renegade is right that a fundamental shift in culture and perspective, away from consumerism, is necessary. Unfortunately American culture and economy and industry are deeply rooted in the practice of mass consumption and disposability. It will take a sea change to move beyond that...

None of this means it's not worth doing the right thing, doing your part, trying to make a difference!

- Oshyan
1121
Living Room / Re: On Wikipedia, Cultural Patrimony, and Historiography
« Last post by JavaJones on November 15, 2010, 02:37 AM »
What's really interesting about this is the following line "to challenge absolutist narratives of the past". *That* is a cool idea, and a cool interpretation of the value of the wiki system. The very recording of edits over time as part of Wikipedia's system is itself a challenge to singular views of at least historical events. That's really pretty profound...

- Oshyan
1122
Living Room / Re: Wikipedia Book Creator
« Last post by JavaJones on November 14, 2010, 03:07 AM »
Oo! But is this system available for use on our own privately hosted wikis? Please say yes!

- Oshyan
1123
What about DoubleTwist? Granted it does more than you might want as far as PC syncing, photo management, and whatnot, but it's a pretty decent (albeit simple) media player generally speaking, and it does do ratings on the 5 star scale. I just tested it and you can rate on the phone and sync back to the PC as well. :)

- Oshyan
1124
N.A.N.Y. 2010 / Re: NANY 2010 Release: Twigatelle
« Last post by JavaJones on November 14, 2010, 02:26 AM »
Aww Twiggy, I miss ya! :D

- Oshyan
1125
Thanks for letting me know MLO was finally going online. I've been waiting for this! :D No, really. I never got into MLO because I use so many different computers, and really need something that can remind me wherever I am. So this is good news for me. I understand it's less welcome to you, but I say you stick to the real concern: that the desktop app would go away. If that's your fear, contact the dev and make sure they know how much you like and prefer the desktop app, and that you just hope it doesn't go online-only. There's nothing wrong with having a web-based version or one that hooks into cloud storage *optionally* - for many (like me) it's a great feature, perhaps even a necessary one. As long as it doesn't hurt the core app's functionality, it's a good thing.

- Oshyan