  Show Posts
1  Other Software / Developer's Corner / Re: Need help coding website with music and database on: March 26, 2014, 02:35:38 PM
Also, SoundCloud: https://soundcloud.com

- Oshyan
2  Main Area and Open Discussion / General Software Discussion / Re: Why 24-bit/192kHz music files make no sense - and may be bad for you! on: September 27, 2013, 07:10:32 PM
So we have 5-45,000 Hz earphones; great, that means you can do the blind testing I was referring to, to check your theory that sub-audible frequencies are perceivable by other means (or subconsciously). That's been my point all along: it matters not a whit whether you record the full range of sound unless we can demonstrate conclusively that a range beyond what we currently record and reproduce is actually perceivable (consistently, reproducibly) to the listener. *That* would make a compelling case for expanding the range of audio recording, but it hasn't been demonstrated yet. It can easily be tested with modern equipment, though. So the thing to do is get a set of those headphones and some sample Pono files and do some blind tests. Since you already have good headphones, I nominate you as our first test subject. cheesy

Quote
It's like selling 3D content that plays in a regular TV, but is way better in a 3D TV, that may never be produced.

"Way better" is highly debatable when, as I've pointed out multiple times, blind tests show that even MP3 vs. CD audio can seldom be differentiated. I would accept "but is potentially better in a 3D TV" as the comparative. But that doesn't sell it nearly as well, now does it? Wink And the fact that even this minor, incremental difference would only be noticed on a piece of hardware which may never exist... yeah, I'd rather not bother with Pono then and buy stuff over again in 10 years *if* there's a breakthrough.

It's not just that audio is "less profitable" to innovate in, it's also that it's a harder medium to push forward. Audio reproduction got a helluva lot closer to the limits of human perception than video did, and in a much shorter time. The gains that remain to be had are very incremental. We had nearly "perfect" audio reproduction in the home decades ago, whereas for video, high definition has only become mainstream within the last 10 years, and even still it's far from "perfect", not only due to resolution and *color/brightness depth limitations*, but also due to lack of real 3D (with or without glasses), among other things. Audio doesn't suffer from the same limitations. The breakthroughs were easier to make and were made earlier (think multi-channel audio, for example).

My landline phone is just fine. Are you talking about landlines? Cell audio is crappy due both to legacy networks/tech, and the need to conserve bandwidth. There is some push forward toward "HD" call quality though, and I certainly welcome that. It *is* driven by market forces, so your argument there is sound. It's certainly not a fundamental technology limit. We could (and some people do) run Skype-like stuff over modern data networks and get better results.

- Oshyan
3  Main Area and Open Discussion / General Software Discussion / Re: Why 24-bit/192kHz music files make no sense - and may be bad for you! on: September 26, 2013, 01:53:30 AM
Actually, it's usually the professional/producer side that drives *media-based* tech innovation. This has been true of HD video, advances in audio, etc. I have no problem with 24/192 being used in the studio, or at least being available for those who want to use it. The natural progression is then for the speakers that can reproduce it to be developed for high-end studio purposes, then be bought/available for rich people who can afford it, then it ultimately becomes mass market and cheap enough for the average person to buy. That's *if* the technology actually catches on, and *if* it can be produced in a form that is not so delicate or subject to home environment variables that it doesn't work out. So basically I'm just saying that making Pono available now as a home listening technology is pointless and wasteful. By all means keep using it in studios, but let's wait until we can actually hear the difference, at which time great, a format is waiting in the wings.

So, no, the conclusions in the Xiph article are right on. It seems like we're actually in general agreement in terms of *right here and now* and *for the home user*. You just have a different idea of how the progression of technology works. I see little value in making content available without devices that can reproduce it. This is akin to selling 3D video *content* before you have even *invented* 3D TVs! The way it actually went was 3D TVs came out and there was very, very limited content, but their growing adoption drove content production. Think about it in the context of this debate...

- Oshyan
4  Main Area and Open Discussion / General Software Discussion / Re: Why 24-bit/192kHz music files make no sense - and may be bad for you! on: September 25, 2013, 12:39:08 AM
Hmm, how do you know the tests aren't testing for the right things? They are *not* as specific as you are suggesting. Here's how a blind audio test works: a person listens to playback of 2 (or 3) audio segments with identical *content*, but that differ in compression/bitrate/storage media/etc. They are not told which is which, but they know they are listening for a difference (sometimes they listen to 2, then a 3rd, and are supposed to identify which of the first 2 the 3rd corresponds to). If they can reliably detect a difference (or match the 3rd sample), they could correlate that difference with e.g. lossless formats vs. lossy compression. Multiple tests confirm that people are unable to make such distinctions with high enough bitrates in lossy compression (vs. CD audio 16/44.1 as a comparative). It doesn't matter one teeny tiny bit if the way they were able to detect a difference was because of "subsonic" or subconscious frequencies; they are not measuring the specific method of differentiation, only *whether there is any reliable differentiation*. There is not. Therefore the idea that they're not "measuring the right thing" is incorrect. They are measuring the *effect*, not a specific and limited set of criteria.
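
To make "reliable differentiation" concrete, here's a back-of-the-envelope sketch of the statistics behind an ABX-style trial. It's my own illustration (the trial count and score are made up, not from any published test); all it does is ask how likely a given result would be if the listener were purely guessing:

Code:
# Minimal ABX-style significance check: did a listener beat chance?
# Trial count and score below are hypothetical, purely for illustration.
from math import comb

def p_value_at_least(correct, trials, p_chance=0.5):
    # Probability of scoring at least `correct` by pure guessing (one-sided binomial).
    return sum(comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
               for k in range(correct, trials + 1))

trials, correct = 16, 12
p = p_value_at_least(correct, trials)
print(f"{correct}/{trials} correct; p = {p:.3f} of doing that well by guessing")
# A p well under 0.05 suggests a real, repeatable difference; a p around 0.5
# is what you get when the two sources are effectively indistinguishable.

Point being, the test doesn't care *why* a listener can tell A from B, only whether the scores beat guessing.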

As I said I'm not aware of any such tests being yet performed on 24/192 audio, but since people are almost universally unable to detect a difference even between lossy and lossless 44.1 audio, I'm doubtful that the results would be any different. The only possible way they would is if you're right about the subconscious frequencies, which is highly speculative since speakers aren't built to reproduce such frequencies, and are broadly incapable of producing them even when intentionally induced to do so. The crux of my argument is focused on the limitations of audio reproduction equipment, not on whether such effects actually exist in the real world, with live sound (they obviously do). Still, I'd be curious to see the results of such a test, if only to answer your doubts.

- Oshyan
5  Main Area and Open Discussion / General Software Discussion / Re: Why 24-bit/192kHz music files make no sense - and may be bad for you! on: September 24, 2013, 01:03:29 PM
I think you're really just making an appeal to ignorance and elevating the value of the theoretical here, which could be the beginning of science perhaps (if it inspires investigation), but is really just speculation. It is in fact fairly easy to test the limits of what our sound reproduction equipment can produce, and that is ultimately all that actually matters in this consideration because in the end all the recording, mixing, and mastering has to get squeezed through those limited speakers/headphones on the listener's end.

But even if you somehow believe the measuring capabilities we have now can't account for every possible effect, as I said above there is really a simple way to find out whether any of that "woo-woo" audio stuff is *practically detectable by humans* (whether directly or otherwise!), and yet so far such tests have failed to show a difference even between existing high quality (but lossy) audio formats and their lossless sources, much less a difference between two ultra high quality lossless sources. That said, to my knowledge no one has done such a blind test with 16/44.1 vs. 24/192 audio, so if these inaudible frequencies are indeed somehow reproduced by audio equipment, even though they're well outside its rated range, and if somehow humans are able to detect them, then there may be value in Pono and other ultra high quality audio storage approaches.

But I think the problem I have with your argument is that it essentially relies on the supposed limitations in our knowledge of audio science, when in fact, as I've pointed out, we don't need to know everything about audio to test *the effects* (to *understand* the effects we perhaps do, but not to *test whether they exist*). I don't think we need to wait until some possible future breakthrough in audio science to determine whether Pono is worthwhile. This is like someone saying "Homeopathic medicine works but our existing science has no way to measure it", to which I say do some controlled studies and we'll soon see. We can measure effects even if we cannot directly measure methods of action.

So who wants to run a blind test with Pono? I can guarantee you Neil Young won't be doing any fair comparisons (i.e. blind, same audio source, multiple subjects) any time soon. cheesy

- Oshyan
6  Main Area and Open Discussion / General Software Discussion / Re: Why 24-bit/192kHz music files make no sense - and may be bad for you! on: September 23, 2013, 04:01:36 PM
IANAAE (I Am Not An Audio Engineer), so take the following with a grain of salt. Wink

Frequencies that home speakers/headphones cannot reproduce are irrelevant without massive improvements in sound reproduction capability in the home. Given the fact that this has not happened in the last 50+ years (incremental improvements only), I don't expect it to happen in the next 50. Even if it did, the improvements would be so minor for most listeners that the cost and hassle of replacing all their equipment would not be desirable for most. By the time such a conversion was complete, it would be time to re-buy the White Album anyway, and it could be mastered in 512/4096 for all I care. In other words by the time you'll be able to actually hear the difference (due to limitations in your equipment, not necessarily your ears!), you would have likely bought the thing again anyways, i.e. there will be something better than "Pono". Buying Pono stuff now doesn't help you though.

For now and the foreseeable future, much as sub-audible frequencies may be *perceivable* and have an effect *in person*, they are not relevant for recorded music. Nor, in fact, are they relevant for *any* amplified music since there are multiple limits in place there, not least of which are the speakers, but also any live processing being done (reverb, compression, etc.). Even if your entire amplification system is analog, the speakers are still a limiting factor. As are mics that recorded it in the first place, for that matter! There is *so much compromise* throughout any music production process, whether analog or digital, that I think it's a bit silly to cling so tightly to the "purity" of reproducing the finished results with 100% accuracy. Hell, the placement of speakers in a person's room, or how old their headphones are (and thus how much wear they have been subject to, how clean and undamaged their drivers are) will likely impact the sound they perceive far more than the difference between 16 and 24 bit or 192kHz vs. 44.1kHz.

But forget all that; this is what really matters, and where real science comes into it (not the theoretical, the practical!). Multiple blind tests have been done that show that even so-called audiophiles, even self-professed "super hearers", cannot in fact hear the difference between high bitrate MP3s/AACs and original CD recordings. If that's true, how can we expect to hear a difference in the even smaller (relatively speaking) quality differential between CD quality and Pono? Now you can argue theoreticals all you want, but in the end there is one great way to answer this compellingly, and that is to run blind tests with Pono, with 24/192 audio vs. 16/44.1, and let's just see what the results are. This reminds me of the Randi Foundation's million dollar prize for proof of the supernatural - so far nobody has won. cheesy

Until that happens, as far as I can see at this point you're going to be buying files in a proprietary format that are 6 times larger than they need to be, using more bandwidth and hard drive space than necessary (and probably paying more for the privilege too). It's wasteful and unnecessary.

Of course Xiph.org has done a far better job than me of explaining why all of this is misguided. cheesy

By the way Joe: http://www.youtube.com/watch?v=NGJ9Z0wOGYk

- Oshyan
7  Other Software / Developer's Corner / Re: The 2013 Game Developer Gender Wage Gap on: April 05, 2013, 03:13:15 AM
I have been sent the digital subscription for free for the past, like, 6 years, even though I haven't been in the games industry in more than 10 years. There is some more detail in the actual article, but not really the kind of detail you're asking for. I've attached a screenshot that gives an idea of their sample size. It doesn't necessarily inspire tons of confidence in accuracy, but at least they threw out some outliers (particularly high salaries, etc.).

[attach=1]

- Oshyan
8  Special User Sections / Site/Forum Features / Re: Testing waters for radical dc ideas - feedback wanted on: December 21, 2012, 07:41:46 PM
I feel like the single biggest thing that needs to be done is obvious and should really be tried *independent of other muckings* before calling it time for more sweeping change, and that is to finally properly reorganize and CMS-ify the main site, making it a better place for software listings, updates, etc. as well as other content. Until that is done it seems silly to me to talk of totally redoing the forums, which are pretty much working fine in my opinion (and apparently other people here agree). So why not make this an opportunity for the big push finally needed to go CMS on this place? After all, if we can't even successfully make *that* happen, what hope is there of successful transition to some radically new paradigm?

- Oshyan
9  Main Area and Open Discussion / Living Room / Re: Newzbin2 closes its doors on: November 30, 2012, 03:35:51 PM
Yeah, I'm sort of confused how Newzbin2 had subscription fees *and* had to shut down, but Binsearch.info is free and has been doing fine for ages...

- Oshyan
10  Main Area and Open Discussion / General Software Discussion / Re: Help me choose an online backup service on: November 29, 2012, 11:09:05 AM
Duplicati looks very interesting, especially in conjunction with Amazon's new(er) Glacier long-term storage service, which is relatively cheap. For my 2TB data set, it looks like I'd pay about $30/mo for Glacier, far more than I pay for CrashPlan, but still worth it. It would be awesome if Duplicati could somehow support the CrashPlan back-end, but it being proprietary (CrashPlan, I mean), that seems unlikely, hehe.
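
The arithmetic behind that estimate is simple enough. A quick sketch, assuming Glacier's advertised ~$0.01 per GB-month launch rate and ignoring request/retrieval fees (which push the real bill higher):

Code:
# Rough monthly cost to park a backup set in Glacier (storage only).
# Assumes ~$0.01/GB-month, Glacier's rate at launch; request, retrieval,
# and early-deletion fees are extra and not modeled here.
data_gb = 2000                      # ~2 TB data set
rate_per_gb_month = 0.01
print(f"~${data_gb * rate_per_gb_month:.0f}/month for storage alone")  # ~$20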

I'm also curious about the cross-platform nature of Duplicati. It says it's primarily programmed in C# and .NET. It also sounds like the dev reimplemented e.g. Rsync and some other things, but I get the impression the Duplicity Python back-end is still being used in at least some way (which would explain part of its cross-platform capability). I suppose my question is how the GUI is handled on each platform. I ask because I'm inherently a bit wary of interpreted languages due to overhead and inefficiency, and both Python and Java (which CrashPlan is written in) have this potential issue. That being said, it's a much easier way to get cross-platform code...

Unfortunately I don't have an easy way to test Duplicati on my full backup data set, but I'm quite curious how it would compare to CrashPlan in terms of memory use. I'll see if I can get a chance to test it at some point.

- Oshyan
11  Main Area and Open Discussion / General Software Discussion / Re: Seeking experiences from people backing up relatively large personal data sets on: November 28, 2012, 02:32:13 PM
As far as I know almost everything you just said about memory limits in modern Windows is wrong. huh

I *routinely* run applications using more than 4GB of RAM (never mind 2GB). I do *not* have the "/3gb switch" enabled (that was for old 32 bit Windows). 64 bit applications are not the ones that need the "/largeaddressaware" linker flag; it's for 32 bit apps that want to access more than 2GB (but no more than 4GB). On 32 bit Windows OSs, 32 bit apps built with /largeaddressaware can use up to 3GB of memory *when the /3gb switch is enabled*. On 64 bit Windows OSs, 32 bit apps using /largeaddressaware can use up to 4GB. 64 bit apps can use far more memory than 4GB.
http://blogs.msdn.com/b/o...ve/2005/06/01/423817.aspx
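
If anyone wants to see the limits first-hand rather than take our word for it, here's a small sketch (my own, Windows-only) that asks the OS how much virtual address space the current process gets; run it under a 32 bit and a 64 bit interpreter and compare:

Code:
# Query the current process's virtual address space on Windows.
# A plain 32-bit process sees 2GB; a 32-bit /LARGEADDRESSAWARE process sees
# 3GB (32-bit OS with /3gb) or 4GB (64-bit OS); a 64-bit process sees terabytes.
import ctypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [("dwLength", ctypes.c_ulong),
                ("dwMemoryLoad", ctypes.c_ulong),
                ("ullTotalPhys", ctypes.c_ulonglong),
                ("ullAvailPhys", ctypes.c_ulonglong),
                ("ullTotalPageFile", ctypes.c_ulonglong),
                ("ullAvailPageFile", ctypes.c_ulonglong),
                ("ullTotalVirtual", ctypes.c_ulonglong),
                ("ullAvailVirtual", ctypes.c_ulonglong),
                ("ullAvailExtendedVirtual", ctypes.c_ulonglong)]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(status)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
print(f"Address space available to this process: {status.ullTotalVirtual / 2**30:.1f} GB")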

Edit: I see 4wd essentially beat me to it. cheesy

- Oshyan
12  DonationCoder.com Software / Screenshot Captor / Re: Has the time arrived for the 'Animated' Screenshots? on: November 28, 2012, 02:24:35 PM
+1 for being able to capture simple animations and output to animated GIF.

- Oshyan
13  Main Area and Open Discussion / Living Room / Re: In search of ... opinions on RAID at home on: November 27, 2012, 11:51:25 AM
Quote
Just compare RAID to any other backup tool (online cloud, "realtime" file synchronization, etc.), and imagine a case where you are dealing with critical data being generated/processed at a high volume.
Which is exactly what makes it suitable and potentially necessary for enterprise environments and *not* for home users. How many home users do you know who fit that criterion, "data being generated/processed at high volume"? I am one of the most demanding computer users I know and even I don't think RAID is worthwhile on my system, and I spent more than $3000 on it, so I easily could have afforded RAID. If by "data" you're talking about lots of small files (e.g. you're a coder), then I'd still advocate a software solution in that case, because you can use a realtime local versioning system (*not* a DVCS), which accomplishes the same goal *and* improves your work by providing back versions.

Regarding backup "performance", yes CrashPlan uses a lot of memory on my system, but then it's also doing a lot more than a RAID solution would be (encryption, non-local backup, deduplication). If I turn off some of those advanced features, it's reasonable to think memory use will come down. Certainly there are lighter-weight backup (or, perhaps better yet, sync) solutions that are a more direct comparative to RAID, and while yes they inevitably have a greater performance impact than RAID, in practice it can and should be minimal. Even CrashPlan doesn't use much CPU at all, despite its high memory use.

Besides, would you not agree that RAID is *not* backup, and you'll need to be running backup software *anyway*?

Quote
Sometimes the greater the level of protection afforded, the greater the problems created should it ever fail.

Or as a friend of mine once put it: Slay one monster and it's only a matter of time before a bigger monster take its place.
(and other stuff 40hz said)
Yes, exactly. As I said, it's adding complexity, which I think most of us can agree is generally a bad word for the home user. Sure if things are working as expected, it provides benefit, but the moment something goes wrong, even the "planned for" disk failure, it starts to diverge significantly from the simplicity of the average data restore scenario. I suppose being able to replace (install) a failed hard drive should be a prerequisite for running a RAID, at the very least. But this is not necessarily as dead-simple as the average computer hardware jockey might think. If your RAID is not external, then you'd better hope you have your drives well labelled internally, because of course they're all identical. Sure, you can try to match the BIOS or Windows-recognized SATA port with the failing drive, but it's not necessarily trivial. And that can become an issue even in the "expected" failure scenario, nevermind the loss of the controller as others mentioned, or - god forbid - multi-drive corruption.

Quote
Rebuilding a RAID array does degrade performance to a mildly noticeable degree...but restoring from backup - especially an image backup - completely annihilates it. Progress and performance are both exactly zero as you sit about twiddling you thumbs waiting a few hours to get on with your life. And that's only after you get back from the hard drive store (with the replacement) which will hopefully still be open at whatever ungodly hour the thing decides to go poof at.
Unless you keep a spare drive around in the RAID scenario, you'll run in degraded mode until you replace the drive, which is riskier. You could keep a spare drive around for recovery in both scenarios. Also, RAID on the boot volume? Another complication. I was sort of assuming we're RAIDing our critical data store, and thus "full system image" backup isn't necessary. Use a simple sync "backup", your backup drive is then a 1:1 copy of your data, and you can just flip it over to primary if your main data store disk fails. In other words, the issues you point out - if they are even really issues for the home user - can be mostly dealt with using simple software and configuration strategies. That being said, I would still contend that downtime concerns of that significance are really fairly exclusive to enterprise use. After all, what home user can't just go out for a movie while their backup restores?

So, basically, RAID1 for supplementary backup purposes only, and to aid speed of recovery, *if* it's worth the cost and potential hassle to you. But frankly I just feel like it's a slippery slope to ever recommend RAID to any "home" user. The people barney is talking about sound more like IT professionals and potentially have the knowledge to deal with anything that would go wrong, so it's a lot more reasonable for them to make the (informed) choice to use it at home.

- Oshyan
14  Main Area and Open Discussion / Living Room / Re: In search of ... opinions on RAID at home on: November 25, 2012, 08:52:41 PM
Sounds good. Tell us what they say! cheesy

- Oshyan
15  Main Area and Open Discussion / Living Room / Re: In search of ... opinions on RAID at home on: November 25, 2012, 07:17:11 PM
A reasonably good backup is better than RAID because RAID only provides 1 potential advantage, and then only when it's working well: speed of "recovery". In *theory* you can quickly and "seamlessly" recover from loss of a single drive in a RAID array by simply replacing the drive that goes bad, and you don't lose any data. With most RAID solutions there is "rebuild" time during which there will be "degraded" performance, but at least your data is there. The problems with those are several.

First of all, the chances of 1 drive in your array failing go up the more drives you have. So even though adding more drives theoretically gives you more redundancy of that data, it also raises the chances that any one component in your redundant array will experience some kind of problem. Basically, it adds complexity, and that's generally not a good thing for "home" use. Managing RAID, while simpler than it used to be, also requires more technical savvy than simple backup, again it's complexity.
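
A quick worked example of that first point, with a purely illustrative per-drive failure rate (real rates vary by model and age):

Code:
# Chance that at least one drive in an array fails within a year.
# Assumes each drive fails independently with p = 3%/year (illustrative only).
p_single = 0.03
for drives in (1, 2, 4, 8):
    p_any = 1 - (1 - p_single) ** drives
    print(f"{drives} drive(s): {p_any:.1%} chance of at least one failure")
# 1 -> 3.0%, 2 -> 5.9%, 4 -> 11.5%, 8 -> 21.6%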

Second, in order to best handle a RAID drive failure, you should keep a spare drive around to swap in. This adds to cost of the solution. Cost and complexity are both factors that tend to count more negatively in a home environment than an enterprise one, and are major reasons why RAID is generally not advisable for home use, but may be perfectly useful for business use - businesses have a higher need for consistent uptime and are willing to bear the cost to maintain that. Uptime requirements in the home are generally much less significant and of lower priority.

Third, not all RAID failure is of the expected or easy to handle variety. What if it's not your drives but your RAID controller that fails? Well, if you've done it properly with hardware RAID, you need a new RAID controller which may not be cheap, probably as much or more than replacing a hard drive, and you aren't likely to be keeping one around as a spare like you would with the drive. Also, better make sure it's the exact same controller model or it might not recognize the existing RAID array.

In the end, you need a backup of the RAID anyway, and the only thing the RAID gets you that backup doesn't is theoretical speed of recovery. But that's only if you're willing to spend the money to do it right and have a spare drive around. So you have to ask yourself, is recovering my data super quickly really that important?

The other thing is that the data recovery problem is potentially easily solved with a regular backup system, assuming we're talking about simple drive failure scenarios. You just set up a frequent sync to a second drive in the system (not a RAID1, although that could be done, but is generally overkill); then, if your main data drive fails, you just switch over to the 2nd. Problem solved. Or, in my case, I back up to an external drive connected by USB3: I get internal-like speeds but the unit is portable, so A: it has its own power supply and may not fry even if my computer does (e.g. if the PSU in my tower goes), and B: if my tower does die, I can just plug my backup drive into another computer and have my data available immediately. RAID doesn't accomplish any of that.
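
To show how little machinery the "sync to a second drive" approach needs, here's a bare-bones sketch (paths and the newer-file check are just illustrative, and it deliberately never deletes anything on the destination; a real sync tool handles deletions, locked files, retries, and logging):

Code:
# Bare-bones one-way mirror: copy new or changed files from SRC to DST.
# Illustrative sketch only -- dedicated sync/backup tools do this far better.
import os, shutil

SRC = r"D:\Data"         # hypothetical main data drive
DST = r"E:\DataMirror"   # hypothetical second/backup drive

for root, dirs, files in os.walk(SRC):
    target_dir = os.path.join(DST, os.path.relpath(root, SRC))
    os.makedirs(target_dir, exist_ok=True)
    for name in files:
        src_file = os.path.join(root, name)
        dst_file = os.path.join(target_dir, name)
        # Copy if missing on the destination or if the source copy is newer
        # (the 1-second slack allows for filesystem timestamp granularity).
        if (not os.path.exists(dst_file)
                or os.path.getmtime(src_file) > os.path.getmtime(dst_file) + 1):
            shutil.copy2(src_file, dst_file)

Schedule something like that (or a proper sync utility) to run every hour or so and the second drive stays a near-1:1 copy you can switch over to.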

Bottom line: RAID adds expense and complexity that is rarely justified in a home environment. I speak from some amount of experience here: I had an external RAID unit holding a ton of my data, it died on me, and I had to pay a bunch of money for data recovery. RAID is not backup, and it doesn't always deliver the "quick recovery and graceful failure" it promises, either.

- Oshyan
16  Main Area and Open Discussion / General Software Discussion / Re: Seeking experiences from people backing up relatively large personal data sets on: November 25, 2012, 06:36:29 PM
I'm running a 64 bit version of Windows 7. Provided the application I'm running is 64 bit, it can allocate as much memory as I have available. I don't recall whether I'm running 64 bit Java (CrashPlan is programmed in Java, unfortunately), nor whether CrashPlan itself would need to be specifically programmed to take advantage of 64 bit memory space or if simply running in 64 bit Java would do the trick (I'm guessing the latter). But in any case my memory limit in the config files is 2048MB, and it's not going over that.

- Oshyan
17  Main Area and Open Discussion / General Software Discussion / Re: I need some help configuring Adobe Photoshop Lightroom 4 on: November 23, 2012, 02:15:46 AM
Hmm, odd that it freezes typing into text fields. I would definitely try to do some performance profiling while testing that specific issue. The A/V could easily be causing it, or some other active monitoring process (backup, search indexing, etc.), but you're probably only going to be able to tell by monitoring CPU and file accesses. Also, as far as I know, the metadata changes aren't immediately written to disk, so it seems odd it would slow things down *right as you're typing*. Is there any global text-related software installed/active, such as a system-wide spell checker, text expander, multi-language switcher, etc.?

- Oshyan
18  Main Area and Open Discussion / General Software Discussion / Re: Seeking experiences from people backing up relatively large personal data sets on: November 23, 2012, 02:12:37 AM
Thanks Renegade. Unfortunately with that little data (relatively speaking), it's not a direct comparison. I also have a very beefy machine, actually a bit beefier than yours. Wink And have 16GB of RAM. Most of the time I can "spare" 2GB for my backup process, it just seems like I shouldn't have to. However...

Jibz, I appreciate the angle you took, and it was something I was thinking about as well but didn't really know how to quantify. From your "back of the napkin" calculations, the memory use could indeed be justifiable for deduplication. I'm kind of tempted to disable that if I can and see what happens. I do have 2 separate backup sets, 1 for photos (like you, though I haven't changed its frequency of backup, and maybe I should), and one for everything else. The photos are by far the largest backup set, about 2/3 of the data.
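
For anyone else following along, this is roughly the flavor of napkin math involved. The block size and per-entry overhead below are pure guesses on my part (not CrashPlan's actual parameters), but they show how a block index over ~2TB can plausibly eat gigabytes of RAM:

Code:
# Back-of-the-napkin: RAM needed to keep a dedup block index in memory.
# Block size and per-entry cost are illustrative guesses, not CrashPlan's numbers.
data_bytes = 1.9 * 2**40           # ~1.9 TB backup set
block_size = 64 * 2**10            # assume 64 KB average dedup block
bytes_per_entry = 50               # ~20 B hash + offsets/pointers/overhead

blocks = data_bytes / block_size
index_gb = blocks * bytes_per_entry / 2**30
print(f"~{blocks / 1e6:.0f} million blocks -> ~{index_gb:.1f} GB just for the index")
# With these guesses: ~32 million blocks -> ~1.5 GB, before the JVM's own overhead.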

So, I'll try to tweak a few things, but would still love to hear some feedback from others with similar backup needs/scenarios, especially anyone using one of the other "unlimited" online backup services with 1+TB of data, e.g. Carbonite, Backblaze, etc.

- Oshyan
19  News and Reviews / Mini-Reviews by Members / Re: Not-so-mini review of CrashPlan backup software on: November 21, 2012, 07:23:38 PM
Sooo, I started running into a persistent crashing issue with, er, CrashPlan (hah!). Upon contacting support, they indicated it seemed to be running out of memory and that I should up the amount allowed for the service process. OK. Default is 512MB. Increased to 768MB. Nope. 1024MB. No... 2048MB? Yes, it works! And oddly, for a few days, seems to be using little more than the 500MB it was originally allowed, even though a 768MB allowance did not let it run. But wait... a few more days later and the service process is now using 1.5GB!? Oh damn. A week later, ~2GB. Well crap. Response from their support is "You have a complex/big backup set, CrashPlan is doing a lot of work, it's going to take a lot of memory." OK, I say, "Do you think this is "normal" for anyone backing up this much data in general, or is this just CrashPlan?" Unsurprisingly, their answer is "I can't comment on other software." So now I'm really wondering if this is just what I have to put up with because of my "big data" needs (I hope and think not!). Hence my new thread, hehe: http://www.donationcoder....rum/index.php?topic=32951

- Oshyan
20  Main Area and Open Discussion / General Software Discussion / Seeking experiences from people backing up relatively large personal data sets on: November 21, 2012, 07:23:00 PM
I am currently dealing with some issues with CrashPlan, the combined online and local backup service I reviewed and selected last year for my personal backup needs: http://www.donationcoder....m/index.php?topic=26224.0

One of the problems I am seeing is really high memory use, 1.5-2GB for the backup process (running as a service) at peak. It starts out lower but climbs over the course of a day or two to about that level, then hangs there, presumably as a result of performing more complex operations on the large data set, e.g. encryption, deduplication, versioning, etc.

Now until recently I've been reasonably happy with CrashPlan, but my confidence has definitely been shaken lately. I'm not seeking actual recommendations for other options just yet, but I'm starting the research process. A big part of that is trying to determine whether what I am experiencing is anywhere close to normal *considering my data backup needs*. It may simply be that I'm asking too much of the system and need to get more reasonable, hehe. So what I would love is to hear from other people who are doing fairly large backups to *online* systems, ideally with the following features/characteristics (or close to):

  • Data set at least 1TB, preferably around 2TB (my full data set is 1.9TB at present)
  • Number of files at least 1 million, ideally 1.5 million or more (I have 1.5 million files backed up at present)
  • Combined local and online backup (online backup is an important component; if you're only doing local, your info may be valuable, but it makes it not a direct comparison with CrashPlan)
  • Encryption (being done locally)
  • Deduplication being done on the backup set(s)
  • Continuous backup/file system monitoring (this is not a critical requirement as I do not absolutely need the feature, but this is the way CrashPlan runs, so it would make it most directly comparable)
  • File versioning

The info I'm looking for is:
  1. What software are you using?
  2. How often/on what schedule does it run?
  3. How much data are you backing up, both in terms of number of files and total size?
  4. How much memory does the process (or processes) use at peak and on average?
  5. How much CPU does the backup process use when actively backing up?

Hearing from other CrashPlan users with similar circumstances to myself would certainly be useful. It's very possible that the combination of data size, number of files, and features such as deduplication and file versioning simply make such high memory use somewhat inevitable (or a much slower backup by paging out to disk a lot more). If so, then it's time for me to think about getting rid of some features like possibly versioning (or try reducing length of version history perhaps). But I won't know until I can get some reference points as to whether this seems normal under the circumstances. Trying a bunch of different backup systems myself seems somewhat unfeasible as most would make me pay for uploading more than a fraction of my data, and online backup is a critical component of this.

Any info you can provide on your experiences would be great. Thanks!

- Oshyan
21  Main Area and Open Discussion / General Software Discussion / Re: I need some help configuring Adobe Photoshop Lightroom 4 on: November 21, 2012, 06:53:55 PM
Can you give examples of what text fields (or other actions) trigger the problem? Does it seem at all related to accessing the Windows file system, or is it far more Lightroom-specific, i.e. typing in pretty much any dialog (e.g. adding a keyword to a file) causes it? Are there any other delays or slowdowns, both in Lightroom, and other apps? I use LR4 regularly myself and, though there are occasional issues, it seems at least as fast as LR3 for me in most cases, and for some things it's faster.

- Oshyan
22  Main Area and Open Discussion / General Software Discussion / Re: Help me choose an online backup service on: October 31, 2012, 01:43:43 PM
What you say about their forum was not entirely the experience I had when I last checked/used it, but that was more than a year ago. My experience differed in that I saw plenty of complaint threads, so it did not seem so much like the forum was "sanitized", and there was a reasonable level of response from support there, but still too many unanswered questions, particularly for more obscure issues. In other words, I found that it was not a censorship issue (though that may be the case now, if they've decided to clean things up); rather, they answered easy questions (um, yay?) but sometimes left harder ones unanswered, which is really the opposite of what's needed in a way; the easier questions should usually be covered by an FAQ...

Anyway, glad to hear you got a quick response. Overall I was happy with their support and my direct contact with them, I think their forum support just may not be the best contact method.

Also, that review is pretty much worthless. The guy mentions no real, actual problems besides a default he doesn't like/agree with for what to backup (a default that I personally think is quite reasonable), and that the upload bandwidth being used wasn't ideal. But then he writes snarky and totally non-applicable responses to the legitimately helpful CrashPlan support rep that comments on his review. The rep said nothing promotional, they merely explained the choice of default and a possible reason for the slow upload, both of which could address his only reported issues. He fails to comment on the efficacy of the recommended fixes, spending several paragraphs simply saying how dumb the default is and how promotional the obviously not promotional CrashPlan post is. His responses to other comments are also terrible. Meanwhile his review title/headline implies that CrashPlan actually *crashed*, which he does not report at all in the text of the "review", again only citing what are essentially *defaults he doesn't agree with*. Whoopty fricking do. Sorry mouser, I'm a bit surprised you actually linked to that one. Wink

- Oshyan
23  Main Area and Open Discussion / General Software Discussion / Re: Help me choose an online backup service on: October 30, 2012, 09:05:21 PM
I've been backing up my 2+TB of data online for more than a year now, so it's certainly possible. It wouldn't be possible without the "seeding" option though, which is why CrashPlan is one of the few workable services for my needs. Once the initial large data volume is seeded, even though I generate a lot of data regularly (5-25GB/wk, mostly photos), I can keep up just fine.

However, I'm fortunate to have a broadband service with no data cap, and I've chosen it carefully in part for that reason. I know not everyone has that option, but sometimes if you research a bit there is indeed a possibility. A great example: Comcast's regular consumer service is capped at 250GB/month, but the business class service is *not*. There is of course a price difference, but it's not nearly as bad as you might think.

- Oshyan
24  Main Area and Open Discussion / General Software Discussion / Re: Help me choose an online backup service on: October 30, 2012, 07:49:26 PM
I too continue to use CrashPlan, but remain frustrated with its high memory use and some other issues. The compelling factors in my case are different than mouser's. Unlike him I am in fact dealing with *lots* of data, over 2TB at this point. So I clearly need an unlimited service. That eliminates a number of options off the bat, and makes many others cost-prohibitive. This large data set causes 2 additional problems that further limit the field of options. First, in order to successfully backup 2TB of data "online", you need to either spend literally months uploading at a theoretical maximum speed (which as mouser points out, and we all know besides, is never realized in practice), or you need to have a physical drive sent to you to "seed" the backup. The latter option dramatically speeds the process and is essentially critical when dealing with more than 100-200GB of data, let alone 2TB+. On the other end of that issue, with *restore*, you likewise need a company that provides the service of sending you a recovery drive in the event of a failure, because who wants to be *downloading* 2TB of data to restore? So again this significantly limits my options. Thus any recommendation I could ever make about any service - CrashPlan or otherwise - must take these constraints into account and is therefore based on my particular needs which I grant are not necessarily common to many others.
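
The "literally months" isn't hyperbole, by the way; the arithmetic is straightforward (the upstream speeds below are hypothetical examples, and real-world throughput is always lower than the rated figure):

Code:
# How long does pushing 2 TB through a consumer upload link take?
# The upstream speeds are hypothetical; plug in your own.
data_bits = 2 * 10**12 * 8                  # 2 TB expressed in bits
for mbps in (1, 5, 10, 25):
    days = data_bits / (mbps * 10**6) / 86400
    print(f"{mbps:>2} Mbps upstream: ~{days:.0f} days of non-stop uploading")
# 1 Mbps -> ~185 days, 5 -> ~37, 10 -> ~19, 25 -> ~7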

Bottom line: if you have "big data", CrashPlan may be one of the few viable options, though it is far from ideal unfortunately. Personally I would hope to see truly native client versions of their backup engine in the future, with Qt-based cross-platform UI. This could accomplish similar cross-platform coherence while achieving much lower memory use (I believe) and higher efficiency.

P.S. Dunno if it has already been referenced or not, but here's a Wikipedia table of online backup options which can potentially narrow your options quickly when you're researching: http://en.wikipedia.org/w...of_online_backup_services

- Oshyan
25  Main Area and Open Discussion / General Software Discussion / Another experiment in free software profit models from Bryan Lunduke on: July 06, 2012, 01:16:04 AM
A subject of ongoing interest here (for obvious reasons) is how developers can make money from free or practically free software. Bryan Lunduke has previously had some interesting things to say on this subject, his name has come up here before in related discussions, and he's been experimenting with various approaches for a while now.

Now he's trying a new approach to funding his software development efforts, which he lays out in this blog update: free source code (under the GPL), but only those that donate get compiled binaries. This has of course been thought of and tried before, but I suspect we'll learn more about how it all works out from following his updates from here on as he has tended to be pretty transparent about things. He's got a few other donator benefits thrown in to the mix as well, and I think he's got a reasonable chance of moderate success overall. But is that just due to his existing notoriety as a speaker and FOSS advocate, and building off the established name of his software company? Is this a model that new software devs have a chance with? It remains to be seen if even he will make it work, but I'm hopeful.

I imagine there will also be those who disagree with the idea, perhaps on the grounds that it's against the FOSS ethos, but it's interesting to note that this is coming from a pretty vocal FOSS advocate.

- Oshyan