Recent Posts

2576
Living Room / Google does no evil; kills reMail
« Last post by f0dder on February 19, 2010, 12:08 PM »
Yup, sensationalist headline.

From slashdot:
Hugh Pickens writes "PC World reports that Google has acquired a popular iPhone application called reMail that provides 'lightning fast' full-text search of your Gmail and IMAP e-mail accounts. The app downloads copies of all your e-mail, which can then be searched with various Boolean options. reMail has only been in the application store for about six months — with a free version limited to one Gmail account and a premium version which can connect to multiple accounts. 'Google and reMail have decided to discontinue reMail's iPhone application, and we have removed it from the App Store,' writes company founder Gabor Cselle, who will be returning to Google as a Product Manager on the Gmail team."

While I do believe it's a bit too early to jump to conclusions, this certainly smells fishy.
2577
OpenOffice is slow and has been less stable for me than MS Office (in particular it sometimes locks up when used in conjunction with ClipX, though it can fortunately usually be tricked out of the lock-up - sometimes it plain crashes, which doesn't seem to be because of ClipX), its macro editor environment sucks compared to MS Office's, it has worse documentation (article quality, cross references), and ODF sucks pretty much as much as OOXML (whoever thought non-binary formats were a good default file format should be castrated). Also, it's a joke wrt. page layout compared to MSO.

I still use OOo instead of MSO, though, because it's gratis. And I believe it makes sense for the government to use an open file format (which I don't really consider OOXML to be) for interchange... I just wish OOo and ODF didn't suck so badly.
2578
General Software Discussion / Re: How to encrypt a USB drive without admin rights?
« Last post by f0dder on February 19, 2010, 08:43 AM »
I have a Man Crush on f0dder because of his knowledge of all things secret.  8)


I think Safehouse Explorer from Safehouse software works without Admin rights.  A bit more limited (you cannot edit files from the container, you have to extract them on the Desktop for instance) but it works.  I like their products a lot.  Oh, and it's free.  The full Safehouse suite is payware.
Which means it'll leave residue behind - might as well go for AxCrypt - free + open source.
2579
Living Room / Re: Antivirus companies support virus writers?
« Last post by f0dder on February 19, 2010, 08:41 AM »
If by chance an antivirus flags some new malware in development as malware - the chance exists for any new software - well, I suppose the author will simply swap a few functions around, fiddle with the compiler's optimization options, maybe screw a little with the UPX source code or not use UPX, and it'll pass.
That will stop pattern-based and code analysis heuristics (stuff that analyzes the malware before it runs), but it won't stop HIPS functionality that looks at the actions running code performs. As long as a new nasty privilege escalation bug isn't discovered, a decent HIPS will be able to block the malware. I don't know if there's any decent HIPS around, though, since I haven't been running anti-malware stuff for years :)

What regular users really need antivirus software for is software piracy. Software piracy is not practical without a good antivirus.
I disagree - drive-by exploits are a reality, and it's not like you're likely to get infected by piracy... as long as you have better sources than google searches.
2580
Living Room / Re: Anyone playing Mass Effect 2 game yet?
« Last post by f0dder on February 19, 2010, 08:33 AM »
Wraith808, playing the second game before the first in this case seems like a bad idea, as you apparently get a more engaging story by continuing your current arc from the first game into the second using the save import feature
Great - I'm pretty sure I nuked the savegames.
2581
General Software Discussion / Re: Paragon Virtualization Manager 2010
« Last post by f0dder on February 19, 2010, 07:10 AM »
An exciting read, and great news. Fingers crossed that no complications appear later - or none that will make the v2p process completely unusable.
Yeah, me too :)

I think that any potential problems would manifest very quickly through BSODs, the most likely one being INACCESSIBLE_BOOT_DEVICE - but that's what the "p2p adjust" is for. The WinPE disc even has an option to fix the machine SID, but apparently that's irrelevant. I think the road to success involves not installing other drivers (going to try things like MagicDisc and a ramdrive, though!) and not activating windows in the vm (you have a 30-day grace period while doing the install, so why bother? :)).
2582
General Software Discussion / Re: Paragon Virtualization Manager 2010
« Last post by f0dder on February 19, 2010, 06:16 AM »
Eureka!

Did a p2v test last night, transferring my current workstation to vmware. It took a bit to get working, but that wasn't PVM2010's fault - I attribute it to the fact that I was doing the operation at 3am. I forgot to include the Win7 Boot Config Data partition, so obviously the system wasn't bootable; marking the partition as active and running the "fix problems" thing from the Win7 install CD twice made the virtualized system bootable, though. It wasn't super usable, but that's because of the quirky way I have %docs% and %temp% set up on my workstation. Verdict: p2v was a success!

Today I brewed a strong pot of coffee and set out to test v2p. Created a vmware install of Win7-x64, using the version 6.0 disk format rather than 6.5 (not sure if this is necessary, but I noticed that for yesterday's p2v, PVM2010 created a 6.0 format - better to be on the safe side). Did a few minor tweaks after install, nothing major; basically I wanted to be able to tell whether v2p was going to nuke my user settings (Microsoft's sysprep does that). Realized my external USB enclosure was fried and my largest USB flash drive is 8 gigs, so I had to copy the vmware disk image to my fileserver; thanks doyc for gigabit lan.

Proceeded to burn the WinPE based PVM2010 recovery/advanced-stuff ISO image, and booted my test machine from it. Entered the "Full Scale Launcher", connected my virtual disk image (networking worked out of the box, <3 WinPE), nuked the existing partitions on my testbox and did a "Copy Disk" operation. ~12 minutes and ~8 gigabytes later the copying was done and I disconnected the virtual disk. Finished the operation by running the "P2P Adjust" wizard (without injecting any new drivers), rebooted the system and crossed my fingers.

Lo and behold, the system booted. It started doing a checkdisk + reboot (probably a normal procedure after restoring an image), and then entered a "preparing your computer for the first time" screen. "Oh great", I thought, "it's probably sysprepped and will have wiped the user accounts clean". But nope, after this preparation it booted into the customized user account just like I had hoped.

Verdict: v2p seems to work :D

This is going to be so convenient for doing new OS installs... get everything right in vmware, from the comfort of your existing & smoothly working OS, without suffering reboots of the physical machine or "oh gosh, I screwed up, reinstall time", along with the benefits of vmware snapshots and all that jazz. When satisfied, you do a quick image restore (rather than installing OS + apps with multiple reboots) and a superfast p2p-adjust operation, and you're good to go.

I'll do a more comprehensive test today, it's time to upgrade my laptop from Vista-x64 to Win7-x64. If that goes smoothly, I'll progress to getting a vLite tweaked version onto my workstation, before the Win7 RC I'm currently running expires :)
2583
General Software Discussion / Re: How to encrypt a USB drive without admin rights?
« Last post by f0dder on February 19, 2010, 04:06 AM »
AxCrypt is apparently good, but keep in mind that it's a usermode solution - meaning that to access the files, they have to be temporarily decrypted. This means the possibility of leaving residue on the computer where you're accessing the files; even if the temporary files are overwritten/wiped, they're still temporarily available in plaintext... and if you're modifying them with a program that uses "save-to-tempfile-then-rename" to achieve safe saves, then wiping will not get rid of that residue. Also, if the target system is using an SSD, wear leveling means the old data may still be recoverable.
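
For illustration, here's a minimal sketch of that save-to-tempfile-then-rename pattern (plain C stdio, a simplification of what editors actually do) - note that the old file's clusters are never overwritten, only unlinked:

Code: [Select]
#include <cstdio>   // fopen, fwrite, fclose, remove, rename

// Crash-safe save the way many editors do it: write the new contents
// to a temp file, then rename it over the original. Safe against
// crashes mid-save, but the *original* file's clusters are merely
// marked free - the old plaintext stays on disk until something
// happens to reuse those sectors, so wiping the file afterwards
// wipes the wrong bytes.
bool safeSave(const char* path, const void* data, size_t len)
{
    char tmp[1024];
    std::sprintf(tmp, "%s.tmp", path);  // assumes path leaves room

    FILE* f = std::fopen(tmp, "wb");
    if (!f) return false;
    bool ok = std::fwrite(data, 1, len, f) == len;
    std::fclose(f);
    if (!ok) { std::remove(tmp); return false; }

    std::remove(path);  // old clusters freed, contents left intact
    return std::rename(tmp, path) == 0;
}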
2584
General Software Discussion / Re: Paragon Virtualization Manager 2010
« Last post by f0dder on February 18, 2010, 07:13 PM »
Win7-RC started bitching about 11 days/whatever left until it's gonna nuke itself, so I bit the bullet and purchased PVM 2010, with the intent of getting a decent RTM install in vmware and v2p'ing when done. I'll start with some testing, and report progress here. So far, it strikes me that the WinPE boot ISO mentions P2V but not V2P...
2585
General Software Discussion / Re: How to encrypt a USB drive without admin rights?
« Last post by f0dder on February 18, 2010, 06:56 PM »
If you mean for flash drives, some manufacturers offer a utility that will allow you to partition the drive into secure/unsecured areas.  Then when you run the program and enter the password, the unsecured area gets dismounted and the secured area unlocked and mounted.  E.g. Imation Flash Drives, Astone's FlashUtility.
Forget about those - most of those "secure" drives can currently be easily bypassed. Some of them don't have any kind of encryption, others have the flash cell contents encrypted but effectively always use the same passphrase - useless.
2586
General Software Discussion / Re: Why the aversion to .NET Frameworks?
« Last post by f0dder on February 18, 2010, 06:55 PM »
First: I consider C++ my main language, and I still do raw Win32 API coding. But nonetheless:

C sucks. C++ is pretty nice, but lacks a lot of stuff that you'll have to go to 3rd party libraries to get support for, and portability can become a real problem - and there are so many ways to blow your legs off, and a bunch of things that are pretty clunky. Win32 API level programming s-u-c-k-s; it's such an incoherent jungle because of all the years of legacy and the different teams designing different parts. The PlatformSDK headers are an absolutely abysmal hellish collection of turd that still uses preprocessor magic for stuff that really should be enums and inline functions (ever tried creating a function named WriteFile in your own class/namespace? Won't work).
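
For anyone who hasn't been bitten by it yet, a minimal sketch of the A/W preprocessor trick - using DeleteFile as the example, since windows.h textually #defines that (along with CreateFile, SendMessage, and plenty of others) to its ANSI/Unicode variant:

Code: [Select]
#include <windows.h> // #defines DeleteFile to DeleteFileA or DeleteFileW

namespace fs {
    // The preprocessor has already rewritten this declaration to
    // fs::DeleteFileA (or DeleteFileW, depending on UNICODE) before
    // the compiler ever sees it.
    bool DeleteFile(const wchar_t* path) { (void)path; return false; }
}

int main()
{
    // This call gets textually rewritten too, so it still resolves -
    // but the function you wrote no longer has the name you gave it,
    // and code compiled without <windows.h> (or with the opposite
    // UNICODE setting) won't link against it.
    fs::DeleteFile(L"test.txt");
    return 0;
}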

I haven't worked enough with .NET to comment much on the framework yet, but C# seems like a pretty nifty language, really. And part of what makes it nice wouldn't be easy to achieve in a non-JIT language, and can't really be "bolted onto" an existing language like C++.
2587
General Software Discussion / Re: How to encrypt a USB drive without admin rights?
« Last post by f0dder on February 18, 2010, 06:10 PM »
rjbull: the short answer is "if you want it done securely, then no".
2588
And here's an update: "Microsoft Confirms Update-Linked BSODs Required Compromised Machines". Eat your hearts out, anti-MS people :)
2589
Living Room / Re: Antivirus companies support virus writers?
« Last post by f0dder on February 18, 2010, 09:54 AM »
True, but AV software can't "protect" you from a hacked legitimate site either. Drive-bys ... part of Common Sense (these days) involves reduced permissions & UAC which is a combo that even works on the 0 day stuff the AV types haven't had time to respond to yet.
UAC is nice, and I depend on a combo of UAC and Firefox with AdBlock + NoScript - obviously NoScript won't help me if a legitimate whitelisted site is hacked, though. And UAC wouldn't have protected me against the NTVDM local privilege escalation if I had been on a 32-bit system.

OTOH an antivirus product (or rather, a HIPS) that depends not just on stupid static analysis but on some decent kernel-mode hooks could add an extra layer of protection.
2590
Developer's Corner / Re: Help Me Decide How to Giveaway Some C++ Books Here on DC
« Last post by f0dder on February 18, 2010, 08:14 AM »
@f0dder - Effective STL perhaps?
Nah, wouldn't say so - for-dummies/xx-time-units books tend to be pretty dumbed down. It's been a couple of years since I read Effective STL, but I definitely don't remember it (or Meyers' other works, for that matter) as being dumbed down to the point of uselessness :)
2591
Living Room / Re: Antivirus companies support virus writers?
« Last post by f0dder on February 18, 2010, 07:06 AM »
I've been harboring much the same ill feelings toward AV companies for years. I'm an advocate for common sense; it's twice as effective, uses (wastes) no system resources, and is free.
And unfortunately it doesn't protect you against drive-by exploits on hacked legitimate sites :/ - the only thing I've been hit by in the last 10+ years. (I still don't run any AV software, though :)).
2592
Developer's Corner / Re: Help Me Decide How to Giveaway Some C++ Books Here on DC
« Last post by f0dder on February 18, 2010, 07:04 AM »
There are some good books in that list; I own several of them already :) - I'd be interested in the Data Structures & Algorithms book, still haven't got one of those. And browsing the TOC, this one seems good.

I wonder if there's any "DS&A in 24 hours for dummies" books... :P :P :P
2593
Java / Re: Multiplatform...
« Last post by f0dder on February 18, 2010, 06:46 AM »
If you stick to core Java (i.e., no JNI, and be careful about stuff like launching external processes), don't depend on OS-specific file locations, et cetera... then your .jar files will execute on other platforms without recompiling.
2594
Living Room / Re: What books are you reading?
« Last post by f0dder on February 18, 2010, 06:43 AM »
erotic Western thriller
"Erotic western" and "western thriller" don't compute! :)
2595
Living Room / Re: Antivirus companies support virus writers?
« Last post by f0dder on February 18, 2010, 06:42 AM »
Welcome aboard, Dmytry :)

For the rentacoder remark - heh, I browsed rentacoder jobs once, and saw more than a few jobs almost certainly involving development of trojan software (private description, required ability to work with gmail, yahoo, facebook etc. accounts, network programming experience, and you have to be located in the former eastern Soviet bloc).
Nasty - that's pretty much as blatantly obvious as "trojan writers wanted" :/
2596
Living Room / Re: The interwebs are run by magic?
« Last post by f0dder on February 17, 2010, 02:52 AM »
Haha, that's a nice one :)
2597
And .NET programs don't really take that long to start, anyway. Paint.NET, for instance, is around 1 second on my machine - I find that acceptable. And C# is a pretty darn nice programming language :). I understand the aversion to Java better, but only because most Java UIs look like crap and are pretty darn sluggish... the platform itself offers acceptable performance for a lot of stuff, really :)
2598
Developer's Corner / Re: Help Me Decide How to Giveaway Some C++ Books Here on DC
« Last post by f0dder on February 16, 2010, 04:28 PM »
What ideas do you have?
Ship them all to me or Jibz :)
2599
Are the HTML files formatted/structured in a consistent way? If they are, it should be easy to write regular expressions to extract the information. If it's "just a bunch of pages" that have names, addresses and emails as part of other content, that will probably not work :)

If the files are consistent, can you provide an anonymized version of one of the files? I.e., something that has all the header/footer fluff and general HTML structure (also anonymized, if need be), along with some anonymized name/addr/mail entries (if no fields are optional, a single entry would be enough - if some fields are optional, it would be good to have a few entries so one can see what a missing field results in wrt. the HTML).
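
Just to show the kind of thing I mean, a throwaway sketch against completely made-up markup (std::regex syntax - read boost::regex if your compiler doesn't ship it); it only works if every entry really follows the same pattern:

Code: [Select]
#include <iostream>
#include <regex>
#include <string>

int main()
{
    // Hypothetical, perfectly consistent markup - the real expression
    // would have to be written against an actual sample file.
    std::string html =
        "<tr><td class=\"name\">Jane Doe</td>"
        "<td class=\"addr\">12 Example St</td>"
        "<td class=\"mail\">jane@example.com</td></tr>"
        "<tr><td class=\"name\">John Roe</td>"
        "<td class=\"addr\">34 Sample Ave</td>"
        "<td class=\"mail\">john@example.com</td></tr>";

    std::regex row("<td class=\"name\">([^<]*)</td>"
                   "<td class=\"addr\">([^<]*)</td>"
                   "<td class=\"mail\">([^<]*)</td>");

    // One semicolon-separated line per entry: name;address;email.
    for (std::sregex_iterator it(html.begin(), html.end(), row), end;
         it != end; ++it)
        std::cout << (*it)[1] << ';' << (*it)[2] << ';' << (*it)[3] << '\n';
    return 0;
}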
2600
General Software Discussion / Re: Mass checksum checker
« Last post by f0dder on February 14, 2010, 02:26 PM »
... (oh, and while the implementation model might be elegant, git as a whole definitely isn't - ugh!).
Can you elaborate? My understanding is that when Git first came out it was quite difficult to use, and the documentation was lousy, but it has since matured and those days are gone.
I'll try :)

First, let me state that I think the git model seems pretty solid overall: the way repository information is stored in the .git folder (and the way it's structured), the way server communication is done for remote repositories, et cetera. I haven't looked into each and every detail (e.g. I don't know if file blobs are stored directly or if they're compressed (can't see why they would be)), but I understand the idea of storing blobs and referring to just about everything through their SHA-1 hash values (I wonder why SHA-256 wasn't chosen, considering some known SHA-1 defects, but it's not too big a deal - the focus isn't to guard against attackers but to avoid collisions under normal circumstances).
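
To make the blob naming concrete: a blob's ID is the SHA-1 of a "blob <decimal size>\0" header followed by the raw contents, which is easy to verify against git hash-object. A quick sketch using the Win32 CryptoAPI for the hashing (minimal error handling, for illustration only):

Code: [Select]
#include <windows.h>
#include <wincrypt.h>
#include <cstdio>
#include <string>
#pragma comment(lib, "advapi32.lib")

// Git names a blob: sha1("blob " + decimal size + '\0' + contents).
std::string gitBlobId(const std::string& contents)
{
    char header[32];
    // sprintf's return value excludes the terminator; +1 hashes the '\0' too.
    int headerLen = std::sprintf(header, "blob %u", (unsigned)contents.size()) + 1;

    HCRYPTPROV prov = 0; HCRYPTHASH hash = 0;
    CryptAcquireContextA(&prov, 0, 0, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT);
    CryptCreateHash(prov, CALG_SHA1, 0, 0, &hash);
    CryptHashData(hash, (const BYTE*)header, headerLen, 0);
    CryptHashData(hash, (const BYTE*)contents.data(), (DWORD)contents.size(), 0);

    BYTE digest[20]; DWORD len = sizeof(digest);
    CryptGetHashParam(hash, HP_HASHVAL, digest, &len, 0);
    CryptDestroyHash(hash); CryptReleaseContext(prov, 0);

    char hex[41];
    for (DWORD i = 0; i < len; i++) std::sprintf(hex + i * 2, "%02x", digest[i]);
    return std::string(hex, 40);
}

int main()
{
    // Should print the same ID as: echo -n "hello" | git hash-object --stdin
    std::printf("%s\n", gitBlobId("hello").c_str());
    return 0;
}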

My gripes are more around the end-user tools. One thing is that the Windows port is still a bit rough (blame the *u*x people for not writing properly modular and portable code); this is something I can live with - but gee, even after creating hardlinks for all the git-blablabla.exe in libexec/git-core, the msysgit install still takes >120meg of disk space... subversion is 7.4meg. Of course msysgit comes with a lot more than svn, but I shouldn't need all that extra. And I don't currently have time to check what is absolutely necessary and what's just icing on the cake; considering that git was originally a bunch of scripts, and the unix tradition of piecing small things together with duct tape, I don't feel like playing around right now :)

The more important thing is how you use the tools. Sure, for single-developer local-only no-branch usage, it's pretty much a no-brainer, and most of the terminology matches traditional C-VCS. There are some subtle differences here and there that you have to be aware of, though - like what HEAD means. IMHO it would have been better to use new terminology for some things - like "stage" instead of "add" (having "add" overloaded to handle both add-new-file and add-to-staging-area is bad). "Checkout" for switching branches doesn't seem like the smartest choice of term to me, either. And not tracking renames (instead depending on the client tool to discover them, probably through matching SHA-1 values?) also seems like a bit of a mistake. Relatively minor points, but things that can introduce confusion...

Where things can get hairy is when you collaborate with other people, especially when juggling branches and "history rewriting". Git has some very powerful features that let you mess things up bigtime - which by itself might not be a problem, but with the various overloads the commands have, and with history rewriting (i.e., commit --amend and rebase) seeming to be pretty common operations, you really have to be pretty careful. Some of it isn't much of an issue if you're a single developer (although you can inadvertently destroy branch history, which can be bad), but you need to be really careful with rebasing once you work with other people - the Pro Git book has a good example of why :)

All that said, I'm considering moving my own VCS to git. D-VCSs clearly have advantages over C-VCS, and while the git-windows tools have rough edges and you have to be careful and you can do some insane things, it's fast and I believe the underlying technology has gotten an important bunch of things right. I'll probably check out some of the other D-VCS tools before deciding, like bazaar and mercurial.

I am new to Git and have had nothing but a good experience with it so far. It is small, lightning fast and non-obtrusive.
Wouldn't say it's small (at least not msysgit :)), but fast indeed, and I like that it has a single .git folder instead of a per-subfolder .svn (that's unobtrusive for me).

It compresses your data down to a bare minimum.
Does it actually compress anything (apart from server communications), or are you just referring to only storing each blob once, identified by its SHA-1 hash value? It's also my understanding that files are stored in their entirety, whereas other systems store patchsets. This makes checkouts extremely fast, but if you have huge files and only change a few lines, it does take up more disk space (usually not a big problem - most sane people don't store server logs under VCS and keep source code files small... 50 committed edits to the main Notepad++ .cpp file would be ~16meg, though :)).

However, I am not at all sure how well it would handle terabytes of data!?
Better than other VCSes... but it's not suitable for just computing hashes :)

I think Git is a little unsuitable, it keeps a copy of the whole file in one revision. It's good for distributed code, but not for file verifying.
I would disagree. Accurate file verification is one of the founding premises of Git. Yes it stores entire files, but it is very efficient. In Linus's talk, he mentions that the repository containing the entire history of the Linux sources (from 2005-2007), was only half the size of one checked out version of the source tree itself!
The point is that to "compute hashes" with git, you'll be putting files under version control. You don't necessarily want to do that for a bunch of, say, ISO images. Heck, I'd even say it's very likely you don't want to do this. First, you don't need the files under VCS; second, you don't want the extra duplicate it creates (remember, every file will live in the .git object stash as well as a checked-out copy).
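
If all you want is mass checksumming, you don't need git at all - a rough sketch of hashing files directly (Win32 CryptoAPI again; error handling trimmed and directory recursion left as an exercise):

Code: [Select]
#include <windows.h>
#include <wincrypt.h>
#include <cstdio>
#pragma comment(lib, "advapi32.lib")

// SHA-1 of a single file via the CryptoAPI - no VCS, no duplicate copy.
bool sha1File(const char* path, char hexOut[41])
{
    HANDLE file = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, 0,
                              OPEN_EXISTING, FILE_FLAG_SEQUENTIAL_SCAN, 0);
    if (file == INVALID_HANDLE_VALUE) return false;

    HCRYPTPROV prov = 0; HCRYPTHASH hash = 0;
    CryptAcquireContextA(&prov, 0, 0, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT);
    CryptCreateHash(prov, CALG_SHA1, 0, 0, &hash);

    BYTE buf[64 * 1024]; DWORD got = 0;
    while (ReadFile(file, buf, sizeof(buf), &got, 0) && got)
        CryptHashData(hash, buf, got, 0);

    BYTE digest[20]; DWORD len = sizeof(digest);
    CryptGetHashParam(hash, HP_HASHVAL, digest, &len, 0);
    for (DWORD i = 0; i < len; i++)
        std::sprintf(hexOut + i * 2, "%02x", digest[i]);

    CryptDestroyHash(hash); CryptReleaseContext(prov, 0); CloseHandle(file);
    return true;
}

int main(int argc, char** argv)
{
    char hex[41];
    for (int i = 1; i < argc; i++)
        if (sha1File(argv[i], hex)) std::printf("%s  %s\n", hex, argv[i]);
    return 0;
}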

Anyway, a lot of this discussion should probably be split out to a topic about git, since it's drifted quite far away from the topic of file checksumming.