

For those with a CrashPlan...


f0dder:
Reading up on Backblaze, I'm not sure that's an option - the only restore option is download through a browser, not their client?!

Currently testing SpiderOak One. The UI is prettier than last time I tried it (but still custom stuff rather than system-native, *sigh*). It's slightly better memory-wise than CrashPlan, but still weighing in at around 400 megabytes for the two running processes. And it's still ungodly slow - there's no network speed indicator in the application, but Windows Task Manager shows between 2-3 Mbit/s. That's not terribly good usage of my 30 Mbit/s upstream.

Also, it seems that when you "Download" something from SpiderOak, if the files already exist on the local machine, those are copied locally rather than downloaded? My connection definitely can't do 300 MB in a couple of seconds. This means I can't (easily) gauge how fast I'd actually get my files back in case of an emergency. Sigh.

wraith808:
Is Amazon's Glacier a proper backup destination, if it's your sole remote? I thought it was pretty slow + "expensive" to get stuff out of Glacier? One of the important things wrt. backups is testing your backup archives regularly, which doesn't seem to fit too well with Glacier's model...

-f0dder (August 27, 2017, 03:41 AM)
--- End quote ---

For me, it's part of that 3-2-1 thing (three copies, on two different media, one of them offsite) - Glacier is my offsite cold storage.

Jibz:
There are quite a few aspects to such software that factor in (at least for me). I know CrashPlan uses quite a lot of memory (here it's sitting at ~500 MB), but as long as it's not getting in my way I can live with that. The same goes for bandwidth and CPU usage; the most important thing really is that I don't notice it's running.

One thing I also want is some kind of "time machine" functionality. I do not want to risk a file being overwritten (either by accident or malware) and some simple backup tool simply duplicating that remotely. Ideally there should be some way to go back to how the file was a week or a month ago. Most of these tools seem to support this somehow.
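The "time machine" idea Jibz describes can be sketched in a few lines. This is a purely illustrative helper (not how CrashPlan or SpiderOak actually store versions): before a file in the backup set is overwritten, the previous copy is renamed aside under a timestamp, so an accidental or malicious overwrite never destroys the older state.

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_with_versioning(src: Path, backup_dir: Path) -> Path:
    """Copy src into backup_dir, keeping any existing copy as a
    timestamped version instead of overwriting it."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    dest = backup_dir / src.name
    if dest.exists():
        # Retain the previous version, e.g. "notes.txt.2017-08-27T03-41-00"
        stamp = datetime.now().strftime("%Y-%m-%dT%H-%M-%S")
        dest.rename(dest.with_name(f"{dest.name}.{stamp}"))
    shutil.copy2(src, dest)  # copy2 preserves timestamps/metadata
    return dest
```

A real tool would also prune old versions by some retention policy (keep hourly for a day, daily for a month, and so on) rather than letting them pile up forever.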

Then there is the storage question. Many of them promise "unlimited" storage, but have some small print that allows them to weed out anyone who uses more than what they deem "regular use". This is one area where I like SpiderOak's approach -- they offer you a set amount of storage, and then don't care how many computers or NAS drives you back up. On the other hand, their tiers are somewhat more expensive than the "unlimited" ones. Also, as mentioned, they sometimes run promotions where you can get an "unlimited" account from them as well, and that is where I got mine. I haven't backed up more than ~120 GB so far, so I have no idea at what point they might start to question your usage.

f0dder:
Yes, Jibz, versioning (and retaining versions sanely!) is a very important part of backups for me as well. I don't view a product without versioning as a backup product.

Being able to set bandwidth limits is nice; I don't want a backup application to mess with my internet usage - but utilizing less than 10% of my pipe and taking forever to back up stuff is not optimal, either. And I do worry about how fast SpiderOak is able to restore...
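The bandwidth limiting f0dder mentions is typically done with something like a token bucket. A rough sketch (illustrative only, not SpiderOak's or CrashPlan's actual throttle): after each chunk is sent, the sender pauses just long enough that the average rate never exceeds the configured cap.

```python
import time

class BandwidthThrottle:
    """Simple rate limiter: before each chunk, block long enough that the
    average transfer rate stays at or below limit_bytes_per_sec."""

    def __init__(self, limit_bytes_per_sec: float):
        self.limit = limit_bytes_per_sec
        self.next_allowed = time.monotonic()

    def wait(self, chunk_bytes: int) -> float:
        """Block until the chunk may be sent; return the pause taken (s)."""
        now = time.monotonic()
        pause = max(0.0, self.next_allowed - now)
        if pause:
            time.sleep(pause)
        # Each chunk "costs" chunk_bytes / limit seconds of budget.
        self.next_allowed = max(now, self.next_allowed) + chunk_bytes / self.limit
        return pause
```

For example, with a 1 MB/s limit and 100 KB chunks, the first chunk goes out immediately and each subsequent one is delayed roughly 0.1 s, keeping the long-run rate under the cap.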

I'm having a hard time finding a backup product that has a client (with a feature set) that I like, and the online storage (speed + cost) definitely is a hard thing to get right as well  >:(  >:(  >:(

Deozaan:
The off-topic discussion about decentralized cloud storage has been moved to its own thread here.
