Latest posts of: Shades -
1  Main Area and Open Discussion / General Software Discussion / Re: What's living in your taskbar notification area right now? on: July 27, 2015, 08:56:26 AM
Although I agree with eleman here, whenever an application requires a (full-fledged) database to operate, it is worth seriously considering storing the results of queries etc. in RAM. Using more RAM makes the application much more responsive, because retrieving from and storing into a database is "expensive". Not only time-wise (responsiveness); you also introduce an extra reliability factor, especially when you need to access databases off-site.

Nowadays computers have quite a lot of RAM onboard and usually decent connections to (off-site) databases. However, you cannot trust that connection to be available 100% of the time, while RAM is a much more stable resource. For a good (newbie-friendly) user experience, the trend is to use the most stable resources at hand.
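As a minimal sketch of the idea in Python: cache query results in RAM so repeated lookups never touch the (possibly off-site) database. The names `query_database` and `get_user` are hypothetical stand-ins; `functools.lru_cache` does the actual caching.

```python
import functools
import time

# Hypothetical stand-in for an expensive (possibly off-site) database query.
def query_database(user_id):
    time.sleep(0.1)                    # simulate network/disk latency
    return {"id": user_id, "name": f"user-{user_id}"}

# Keep query results in RAM: repeated lookups never touch the database.
@functools.lru_cache(maxsize=1024)
def get_user(user_id):
    return query_database(user_id)

t0 = time.perf_counter()
get_user(42)                           # first call hits the "database"
cold = time.perf_counter() - t0

t0 = time.perf_counter()
get_user(42)                           # second call is served from RAM
warm = time.perf_counter() - t0
```

The trade-off is the usual one: the cache must be invalidated when the underlying data changes, which is why this works best for read-heavy workloads.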
2  Main Area and Open Discussion / Living Room / Re: [SOLVED] Boot problem/s on: July 26, 2015, 10:10:56 AM
The new kids on the block are not nearly as stable as good old MBR...that's for sure. Faster, yes. Less limited in partition sizes, yes. But not nearly as stable...and when the proverbial sh.t hits the fan, you have way(!) bigger problems. As MilesAhead already said: don't expect software that worked flawlessly with MBR to work with GPT in a similar fashion.

Personally, I try to avoid 2+ TByte hard disks for as long as I can. You can get GPT on smaller disks if you want, but it is obligatory for disks over 2 TByte in size. And given the very bad experiences I had with 1.5 TByte and 2 TByte hard disks (I have a total of 8 of those "doorstops", all brand new, and none of them lasted a year), I'll keep to 1 TByte disks.

Besides the fact that those 1 TByte disks soldier on without any problems whatsoever (anecdotal, I know), the size limit forces you to think about backup strategies, and to actually execute them.

The lure of 3+ TByte disks is certainly there, but common sense dictates that it is never a good idea to "keep all your eggs in one basket". 
3  Main Area and Open Discussion / Living Room / Re: [Help!] Boot problem/s on: July 25, 2015, 07:43:58 PM
Default setting for Windows 7 is to make a 100MByte partition (on a spinning hard disk; SSDs use a 350MByte partition for the same purpose) when installing on an unpartitioned disk. But if the disk already contains partitions, Windows will use those however it sees fit. Anyway, if your system has such a partition, it is essential to the Windows boot procedure.

Not sure what to make of the first partition in your list, as I can't be exactly sure how the boot procedure on your computer was configured.

This is taken from my own system. I added a 120GByte SSD hard disk to an already working PC (disk0). Afterwards I installed Windows fresh on the SSD and as you can see, there is a 350MByte partition (label: 'System Reserved'). This partition must be 'Active'. Depending on what software you use to check the partition structure of your disk, the terms 'Active' and 'Boot' are used to indicate the same thing. The Windows disk manager makes a distinction here, as you can see in the screenshot.

So I am not sure what to make of the 1st partition on your hard disk. Most likely, once you mark the 3rd partition as the boot partition, your system will start working again.

4  Main Area and Open Discussion / Living Room / Re: [Help!] Boot problem/s on: July 23, 2015, 09:23:20 PM
Sounds like the order of the partitions got jumbled up in (parts of) the Windows configuration.

During the boot routine some hard-coded paths are used, and it appears that this is where things go wrong.

Windows partitions are assigned a unique code that is later translated into a drive letter. My suspicion is that this is where your problem starts. Suddenly parts of the operating system are no longer in the locations where the operating system expects them to be, and a generic error code 0x3 is served up to your screen.

Fixing this kind of error might prove more time-consuming than re-installing or putting an image back. Laptop manufacturers lately have the nasty habit of putting the recovery partition in front of the other partitions. The reasoning behind this doesn't make a lick of sense to me: the front is the fastest part of the hard disk, which should be used for the Windows partition. After all, you will spend far more time using the laptop than restoring the factory setup!

My guess is that they do this so they can crank out laptops faster. Anyway, even the Windows installer gets "confused" by this on occasion.

What you could do...and this is a dangerous, possibly warranty voiding proposition:
  • Buy/borrow a USB hard disk enclosure
  • Pull the hard disk from the laptop, put it in the enclosure
  • Hook the enclosure up to a (Windows) PC
  • Install software to manage partitions on that PC
  • Start moving partitions on the disk (in the enclosure!!!), use this order: BOOT, OS, Dell recovery
  • That order of partitions never causes confusion!
If the above sounds like too much mumbo jumbo to you, get someone with the required skills to do it for you. 

That is what I would do, if you asked me...and I know you didn't.

Or put the image you already have back and see if your trust in Aomei Backerupper is warranted.
5  Main Area and Open Discussion / Living Room / Re: The end of the hard disk on: July 22, 2015, 07:00:38 AM
Hard disks can become too big to fail...
6 Software / Post New Requests Here / Re: Simple auto-saving text editor for mercurial commits on: July 21, 2015, 08:30:40 PM
Well, it is not a simple editor:

Perhaps this link suits you better:
7  Main Area and Open Discussion / General Software Discussion / Re: Windows 10 Announced on: July 17, 2015, 06:47:23 PM
Maybe Deozaan is all three; these do not exclude each other.

But I guess I don't want to deal with a masochistic lawyer with a (sick) sense of humor... tongue
8  Main Area and Open Discussion / Living Room / Re: UK to ban encrypted messenger services to combat the specter of ISIS on: July 15, 2015, 08:19:27 PM
People who really want <whatever>, will <whatever>.

I read somewhere that communication takes place through shared web accounts: someone writes a message and leaves it open, unfinished. The party for whom the message is intended logs into the same account, sees the open message and does whatever with it. It will take a major effort for the spying party to get to the information they seek, especially when this is done at random intervals and from different locations. No need for encryption; the messages hide in "plain sight", so to speak.

And those UK parliament nitwits want access to encrypted traffic (customer-bank communication, communication between energy companies and customers, etc.)? Aren't there lawyers in the UK fighting to prevent (digital) identity theft by criminals and/or government? Note that I still make a distinction between the two, although I am not so sure I should nowadays.
9  Main Area and Open Discussion / General Software Discussion / Re: Problems using a Windows 8.1 as a file-server via VPN connection on: July 14, 2015, 09:05:11 AM
You could consider something like: Owncloud (open source)

That will allow you to create users with passwords that can have access to separated and shared storage folders. Make that work locally on the Windows 8.1 machine. You can configure the storage folders to be anywhere on your local machine or network.

Now there are 2 ways to let your users connect to that Owncloud server:
1. Users still use the VPN software to connect to your network, then use a browser to connect to your local Owncloud server where they login and do their up- and downloading as necessary.
2. You make the Owncloud server publicly available on the internet. Users can then use a browser to connect to your Owncloud server where they login and do their up- and downloading as necessary.

Option 1 would be the easier/fastest/more secure option for you.
Option 2 would be the easier option for your users.

There are alternatives for the Owncloud software, if you don't like that particular piece of software.

Why the requirement of Windows 8.1?
There is a reason you should use Windows Server editions instead of normal Windows editions for this kind of job. Server editions are equipped to handle the many concurrent network connections that allow your users to do their work properly. The limits of normal Windows versions in this respect will become very clear, sooner rather than later.

Heck, even setting up a Linux server is likely to give you fewer problems than the setup you propose.
10  Main Area and Open Discussion / General Software Discussion / Re: A double visualization on: July 14, 2015, 08:39:16 AM
As the owner of an Asus P5QL-EM motherboard (almost the same as yours), I hardly ever used the onboard Intel graphics. I do like the Intel chipsets; these have proved very reliable in my experience, except for the Intel graphics part. So I always used an ATI Radeon HD 4670 card (from Sapphire) with it. An excellent combination for as long as it lasted.

Lately I got an AMD Radeon HD 6450 (from XFX) to ride out what time is left on the motherboard. Here in Paraguay you can get that card new for around 55 US dollars, so in the US they are even cheaper (I didn't look it up, but US prices usually hover around 70% of Paraguay prices). Although the specs of that card appear good for the price, it isn't a good card for multi-monitor setups.

The drivers from AMD work fine with a single monitor. But they are problematic with dual-monitor support on Windows 7 and a real nightmare on Windows 2012 (I used their Catalyst 13.x, 14.x and 15.x ranges of drivers, including the betas). Reporting my experiences to their tech support was met with utter silence from their end.

All I want to say is that a replacement video card won't break the bank (especially if you don't have a multi-monitor setup). You will like the extra options and power that even the "bottom of the barrel" cards from AMD or NVidia give your old system, compared to the Intel graphics.

11  Main Area and Open Discussion / General Software Discussion / Re: Question re archiving/retrieving directories and subdirs using Mercurial on: July 11, 2015, 05:52:25 PM
That trick you suggest does work with a file-based repository. If your repository is stored in a database of some sort, then all bets are off.

Verify whether your setup uses a database or not; I know from experience that SVN can use a database mode. Git appears to use file-based versioning, but I am not completely sure. The same goes for Mercurial.

An example:
I personally have done this with CVS repositories. I copied the original CVS repository (all files and the CVS metadata folders!) from a Linux machine to a Windows machine, installed the CVS server software for Windows NT on the Windows PC, then simply pointed the server software at the copied repository and started it. That was it: all changes/comments (15 years' worth!) were available.

To my understanding, you can choose the storage mode of modern versioning software only when the repository is created. File-based repositories tend to take up more storage space than repositories that use a database, but if you have the space I would use a file-based repository, for the easy backup/restore procedure.
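The "copy the whole tree" idea above can be sketched in a few lines of Python: a file-based repository is just a directory tree, hidden metadata folder included, so a plain recursive copy is the entire backup. The layout below is a hypothetical stand-in, not a real Mercurial repository.

```python
import os
import shutil
import tempfile

# Build a stand-in "repository": a directory tree with a hidden
# metadata folder (.hg for Mercurial, .git for Git). Names are hypothetical.
src = tempfile.mkdtemp(prefix="repo-src-")
os.makedirs(os.path.join(src, ".hg"))
with open(os.path.join(src, ".hg", "requires"), "w") as f:
    f.write("revlogv1\n")                  # sample metadata file
with open(os.path.join(src, "README.txt"), "w") as f:
    f.write("project notes\n")             # sample working-copy file

# Backing up a file-based repository is copying the whole tree,
# hidden folders included; copytree preserves the structure as-is.
dst = os.path.join(tempfile.mkdtemp(prefix="repo-dst-"), "repo-copy")
shutil.copytree(src, dst)
```

The same copy, done with any file manager or archiver, is what made the CVS migration described above work: the metadata travels with the files.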

12  Main Area and Open Discussion / Living Room / Re: Partitioning or Not w. single HDD? on: July 08, 2015, 08:29:41 PM
@Stoic Joker:
We will agree to disagree on this bit of computing  tongue

A partition scheme doesn't help with a drive stopping dead in its tracks; on that we are in agreement. I have encountered that type of HD crash only once, but I must say that I hardly work with laptops, and I assume those devices encounter such crashes more often because of how they are (mis-)treated.

Spinning hard disks tend to die slowly in desktops and servers, at least in my experience. And in those cases I found partitions to be very helpful when recovering/salvaging information. If everything is stored in one partition, the chance of overwriting data that still has to be salvaged with data you are currently restoring is too high.

Let's say I think of the NT file system as a child. With clear borders it behaves well. Without those borders? Expect it to throw tantrums at your most inconvenient moment.

13  Main Area and Open Discussion / Living Room / Re: Partitioning or Not w. single HDD? on: July 08, 2015, 08:58:46 AM
There is another advantage to having everything divided into partitions, especially with the use of portable apps.

Whenever I need/want to use a virtual machine, I just need to install one and make sure the partition with my programs is directly available to it. VirtualBox has such options; I assume VMware is able to do the same.

This saves me a lot of storage space, on both the host and in the virtual machine. I can use the same software at the same time in both the VM and the host; no re-install and/or reconfiguration required. As that is usually the part of a re-installation that consumes the most time, I am glad to be rid of it.

Partitioning is a good thing in my book, especially when the tasks the computer needs to do don't change much over the time you plan to use it. That leaves the sizing of the partitions. Getting that right is important and easy to do wrong, and resizing partitions, a hassle in itself, is not without risk.

The partition sizes I mentioned earlier work perfectly for me, but there is no guarantee that these will work for you. Especially if you want to play games, you should increase the size of the software partition significantly. Modern computer games consume storage space like there is no tomorrow.

I haven't had a need to change the size of any partition on my own system in 6 years (besides partitioning the SSD I got recently).
14  Main Area and Open Discussion / Living Room / Re: Partitioning or Not w. single HDD? on: July 07, 2015, 08:03:13 AM
The software is called Cameyo. Instructions on how to use it are here.

Virtualizing for your own personal use only? Go here.
15  Main Area and Open Discussion / Living Room / Re: Partitioning or Not w. single HDD? on: July 06, 2015, 09:17:15 PM
I always divide any hard disk into at least 3 partitions, but my preference is 4 of those.

One partition for Windows (and literally nothing else).
One partition for software I install.
One partition for data.
One partition for temporary storage.

On the Windows partition I move every user-related folder to my data partition, using the Windows options available to me.
On the Windows partition I move every temporary storage folder (user and system) to the temporary storage partition, again using the Windows options available to me.
On the Windows partition I make the page file 8GByte in size and don't allow it to grow beyond that. On a normal PC I disable hibernation.
Whenever I install software, I select the software partition as its home. And because I am the only user, I use portable versions of software wherever I can. If that is not an option, I have software that lets me make portable versions of applications that normally require installation.

Why do I go through all that trouble?
The Windows partition is hardly ever written to (besides updates). This drastically reduces the need to defrag it, and after some time the Windows file system will have placed the files in their optimal positions on that partition. This way you can make the Windows partition lean and mean, and it will stay that way. I normally allocate about 25GByte for the Windows partition.

The same is true for the partition that contains my software, especially when you don't install new software on a regular basis. I usually allocate 100GByte for the software partition, which is enough because I make sure to configure applications to work with, and store everything on, the data partition. I must add that I hardly play any games anymore.

As a result, the data and temporary storage partitions get a bit messy. I usually allocate 25GByte to the temporary storage partition, mainly because I don't think I need that much garbage on my system to begin with; the rest is for the data partition. I don't care too much about the content of the temp partition and wipe it clean at least once a month. Saves me a defrag session as well  Wink

That leaves the data partition. That one does need a defrag once in a while, but not nearly as often as you'd think; even on that partition the Windows file system will find the optimal locations for the content after some time.

Exaggerated? Perhaps, but in my mind all of the above makes perfect sense. There are many schools of thought regarding this subject and most have merit. I just couldn't help adding my two cents.

16  Main Area and Open Discussion / General Software Discussion / Re: Is robocopy faster than Windows GUI to "Move" files between drives on: July 05, 2015, 09:46:40 PM
I believe you can disable the "Calculating size" bit in Directory Opus.

From your code I understand that you want to move stuff from every subfolder in the %from% folder into one giant pile in the %to% folder.

A word to the wise:
The NT file system doesn't like folders with huge amounts of files in them. If you have such a folder, you will notice that Windows slows down to a crawl until it has finished processing the folder. It doesn't matter how much "horse-power" your PC has: when opening and/or working with a folder that contains several thousands of files, performance degrades significantly. And yes, I know that in theory the NT file system supports loads and loads of files in whatever structure you can think of. In practice: better keep an eye on the number of files you store in a single folder.
I would try something like this:

@ECHO OFF
::User settings
SET /P srcfolder="      Enter source path:  "
SET /P dstfolder="Enter destination path:  "
::Folder checks
IF NOT EXIST "%srcfolder%" (
  ECHO Folder error: %srcfolder%
  GOTO :EOF
)
IF NOT EXIST "%dstfolder%" MKDIR "%dstfolder%"

PUSHD "%srcfolder%"
FOR /r %%a IN (*.*) DO (
  MOVE "%%a" "%dstfolder%\%%~nxa"
)
POPD
17  Main Area and Open Discussion / General Software Discussion / Re: Looking for Software with this feature on: July 05, 2015, 10:17:26 AM
I'm wondering why file size matters, because wouldn't the date be off by even a few seconds if it's two different copies of a file? Even in a high speed automated "log.txt" or something updated and then aggressively backed up, do any of the options above change context if it doesn't need to know the file size (or maybe checksum, because for ex someone opens a text file and then say MS Word adds a line break it's now different.)

The OP refers to file "name, extension and size", but file size is generally an unreliable/imprecise basis for file comparison, whereas content (checksum) is pretty definitive as a data analysis tool.
You seem to have conflated "time" with "size", and yes, "time" is also an imprecise basis for file comparison - mainly because of the different and inconsistent time-stamping standards applied by different file operations and file management tools.

IainB is right about generating a checksum for both files and comparing these to find out whether the files are the same or not.

Unfortunately, it looks like xplorer2 uses CRC to generate these checksum values. The advantage of CRC checksums is that they are fast to generate; the disadvantage is that they are not always unique.

So CRC was replaced with MD5 hash values (which take a bit more time to generate), but nowadays those can also be tricked. The best option for now is to generate SHA-based hash values of files to identify them. But again, these take even longer to generate.

The method IainB suggests is the best method you can apply to determine whether files are unique or not. CRC is better than nothing for this purpose, but not by much. SHA is much better, but consumes a lot of computational resources, so if your system doesn't have many of those (readily) available...expect long waits.
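A minimal sketch of the checksum comparison in Python, using SHA-256 from the standard library. The helper names and demo file names are hypothetical; the chunked reading is what keeps memory use flat on large files.

```python
import hashlib
import os
import tempfile

def file_digest(path, algorithm="sha256", chunk_size=65536):
    """Hash a file in chunks so large files never need to fit in RAM."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def same_content(path_a, path_b):
    """Equal SHA-256 digests mean the files are, in practice, identical."""
    return file_digest(path_a) == file_digest(path_b)

# Demo with three throwaway files (hypothetical names).
d = tempfile.mkdtemp()
a, b, c = (os.path.join(d, n) for n in ("a.txt", "b.txt", "c.txt"))
for path, text in ((a, "same"), (b, "same"), (c, "different")):
    with open(path, "w") as f:
        f.write(text)
```

Swapping `"sha256"` for `"md5"` or `"sha1"` shows the speed/collision-resistance trade-off discussed above without changing anything else.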
18 Software / Find And Run Robot / Re: FARR hangs when %SPECIALSYS_DOCS% included in search folders on: July 04, 2015, 09:16:31 AM
Microsoft follows the guidelines for TCP/IP connections. That is, Windows will wait up to 30 seconds (which feels a lot longer when you are waiting for something to happen on a computer) before it gives up on any network connection. This is part of a (sub-)set of parameters you can only change in the registry. MS doesn't have any window, field, picker, drop-down menu, check box or radio button for adjusting these parameters...on purpose.

The information is available on the MSDN/TechNet websites. I have found it there myself, but those settings can have serious implications, so I won't link to it either.

"Funny" thing is...if your application isn't multi-threaded and it tries to make a network connection it will wait 30 seconds before continuing, if the application needs to make another network connection, it will wait again for 30 seconds. This cycle is repeated every time for every network connection. The user gets the impression the application hangs, but in reality the application works like specified. It's the TCP/IP specifications that create the problem here.
19  Main Area and Open Discussion / General Software Discussion / Re: when Google fails..DC! Cut and paste text adding double carriage return option on: June 27, 2015, 07:45:54 PM
Is it not an option to collect everything in a text file first? Then, when collecting has finished, add the double carriage returns (very easy with a hex editor, or a text editor with similar functionality). And when that is done, copy-paste the content from the text file into Outlook?

Although this seems like more work, it is easier to automate than working directly with the MS Office applications. More reliable too, if you need to take different versions of Office into account.
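The collect-then-fix step can be sketched in a few lines of Python, no hex editor needed. The file and function names are hypothetical; the only real work is doubling each line break before the text is pasted into Outlook.

```python
import os
import tempfile

def double_line_breaks(text):
    # Normalize Windows line endings first, then double every break.
    normalized = text.replace("\r\n", "\n")
    return normalized.replace("\n", "\n\n")

# Round-trip through a collection file (hypothetical name).
d = tempfile.mkdtemp()
collected = os.path.join(d, "collected.txt")
with open(collected, "w", newline="") as f:
    f.write("first line\r\nsecond line\r\n")   # as collected on Windows
with open(collected, "r", newline="") as f:
    spaced = double_line_breaks(f.read())      # ready to paste into Outlook
```

Because the transformation happens on the plain text file, it works the same regardless of which Office version ends up receiving the paste.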
20  Main Area and Open Discussion / Living Room / Re: The end of the hard disk on: June 24, 2015, 08:30:25 PM
Next time I will be more precise and call it "storage device"...that should make everyone happy. smiley
21  Main Area and Open Discussion / Living Room / Re: The end of the hard disk on: June 23, 2015, 11:09:17 PM
For around 1,000 USD a pop there are now decently sized drives that work according to SSD principles, but instead of SATA they use the PCI-Express lanes of your motherboard. If you think SSDs (or SSDs in RAID) are fast...these puppies run 4 to 5 times faster than SSD drives (at their top speed) in most usage scenarios. If you want really fast servers that have no problem shifting mountains of data around, SSDs are already old hat.
22  Main Area and Open Discussion / General Software Discussion / Re: recover an SQL .mdf file that is currently written to a bad sector on: June 23, 2015, 10:54:57 PM
@Stoic Joker:
It is indeed a dangerous proposition. If the database remains working, moving data around by shrinking may be the step that saves your database...or it may create more severe problems. I would try that last.

Now I must say that I am not too accustomed to SQL Server; Oracle is what I know. But the lessons learned in my encounters with that fickle beast apply to SQL Server as well.

To create a full dump of your database, it shouldn't take much more than:

  USE <insert database name>;
  GO
  BACKUP DATABASE <insert database name>
    TO DISK = 'Z:\Bak\SQLServer\<insert database name>.bak'
    WITH FORMAT,
      MEDIANAME = 'SQLServerBackup',
      NAME = 'Full Backup of <insert database name>';
  GO

Creating another database server with SQL Server Express on a different computer should be easy. The defaults provided by the installer are sufficient for a database like yours. You will encounter problems restoring a database on the same server, so make sure you use a different server on a different computer!

Uploading the dump you created to the new server is also not difficult:
  RESTORE DATABASE [NewDatabaseName]
     FROM DISK = N'Z:\Bak\SQLServer\<insert database name>.bak'
     WITH FILE = 1, NOUNLOAD, STATS = 10

This can also be done with SQL Server Management Studio (SSMS for short), an option enabled by default in the installer. It was in the SQL Server 2012 software I used to set up my server, and that is the only SQL Server version I have experience with. Even if the old DB server doesn't have it installed, you can install it on the new server and use SSMS to connect to the old server. If you know the passwords for the old server, you will be amazed how easy it is to export the old database and import it into the new one. SSMS is a very nice tool and easier to work with than what Oracle delivers with their server software; I can tell you that much.

SSMS (for SQL Server 2012 at least) comes with functionality to compare databases. The Oracle software also has such functionality, and it doesn't care whether the compared databases are on the same db server; I assume it is the same with SSMS. Even if that is not the case, there are 3rd-party tools or scripts that can do it.

You could try the TOAD for SQL Server software (available in free and commercial versions). TOAD is much more powerful than Oracle's own software, and the TOAD for SQL Server edition should be in the same league.

Dumping and restoring your database is the first option I would try. Mainly because that is the easiest (especially if there is a redundant SQL Server running in your environment).

Cloning the hard disk from the old server is the second option. But assuming that this server remained active during this thread, it should be the first option by now.

After that I would try shrinking the database and more or less hope it works out.

Next time your company negotiates a new license with the company that delivers the software you work with, get the most angry person that works there and have him/her curse them to hell for holding your data hostage, especially in cases of imminent hardware failure. And seriously look for other vendors of similar software, and let the current vendor know your company is doing so.
23  Main Area and Open Discussion / General Software Discussion / Re: recover an SQL .mdf file that is currently written to a bad sector on: June 22, 2015, 09:51:01 PM
HDClone - that would be a tool to use. The freeware version is slow, but it does the job. It will create a bootable device (CD/DVD/pen drive); after you connect both hard disks, you only need to boot that system up and it will clone your hard disk. It might take a bit (no pun intended) on the problematic sector(s). It doesn't use Windows at all, and that is a good thing in cases like this.

There are alternatives to HDClone if you so desire, but I can (and will) personally attest to the excellent qualities of HDClone.

Maybe Macrium is good (enough) at cloning, but as far as I know their software is meant to make images of a hard disk. And Google confirms.

An image is not a clone...it's close, but it isn't a clone.
24  Main Area and Open Discussion / General Software Discussion / Re: recover an SQL .mdf file that is currently written to a bad sector on: June 21, 2015, 08:42:50 PM
You are correct in that cloning is the most secure way to rescue your data. However, from questforfla's previous posts I gathered that the company he works for doesn't like to spend money on IT. With that in mind I made my comment.

You could also be right about this problem being the cause of earlier issues with their backup procedures. I missed that previous thread while lurking here.  embarassed

Anyway, questforfla should be thinking about running multiple SQL Express servers: one used as the production server and another that synchronizes with the production server at regular intervals. When the production server has a problem, the redundant server can take its place, everyone can continue working, and he has time to properly fix whatever the problem with the production server is.

The main problem with such a setup is the license for the software they use for their business: does their license tolerate the use of a redundant server or not?
25 Software / Post New Requests Here / Re: IDEA: Utility to handle files/folders with illegal characters on: June 21, 2015, 12:42:28 PM
If you still have a 32-bit Windows OS, then I would suggest downloading and using PortableLinux, because then you can run Windows and Linux (Ubuntu) both at the same time. Whatever file gives you problems on one OS might not be problematic on the other, as Windows and Linux have different sets of characters that are illegal in file names.

Too bad PortableLinux only works with 32-bit versions of Windows...the software (coLinux) that allows Linux to run beside Windows is 32-bit only, and it doesn't look like that is going to be fixed any time soon. Having said that, I have personally tried and tested PortableLinux successfully with Windows 2000, Windows XP, Windows 7 and Windows 8.
