
Recent Posts

2101
Living Room / Re: Where have all the trackballs gone!
« Last post by f0dder on November 02, 2010, 01:43 PM »
I've been considering getting a trackball for a while, but there aren't a lot of options (at least not from standard shopping places, and I'm definitely not up for paying an insane amount for a premium device). A thing that has held me back for quite a while is that trackballs don't seem very suitable for gaming... but I don't game much these days, and it wouldn't be that bad digging out a standard mouse when I feel like gaming (especially not if the "Logitech Unifying technology" works and you don't have to plug in anything when adding a new device).

Avoiding carpal tunnel is much more of a concern now than whether the input device is good for gaming. A friend of mine got hit by this recently, and ended up buying a RollerMouse Pro 2. Seems like a pretty nifty device (and insanely expensive!), although it takes a bit of getting used to - different sensitivity than a mouse, and you don't have your usual Mouse Muscle Memory. Bonus points for using optical sensors instead of mechanical ones - probably also part of the reason why it's not exactly cheap.

The Logitech M570 looks very nice, and is something I'll consider if they make a left-handed version and it ends up at a reasonable price tag (the cheapest I can find it here in .dk is ~USD100). A trackball still has the disadvantage that you have to move your hand between the trackball and the keyboard; that might not be so bad wrt. CTS and strain in general, but it's still annoying. With the RollerMouse, the distance you have to move your hands is much shorter - I haven't used it enough to know whether you end up accidentally moving the mouse pointer around with your wrists, though :)
2102
superboyac: it's because of Oracle's tradition of lies, backstabbing, deceit and being just generally despicable. Sure, that's pretty much something you expect from any big corporation, but Oracle just... seems to be worse.

I'm sure somebody else can give some decent examples - I haven't kept links, just cringed every now and then after reading various tech RSS feeds :)
2103
General Software Discussion / Re: Slash your windows boot time
« Last post by f0dder on November 02, 2010, 11:23 AM »
Heh, looking at the main product site I have absolutely no clue what the program does - couple that with the amount of buzzwords, and it's something that I'm very very wary (and weary!) of.

Looks great in theory until you realise you shouldn't even have to worry about it, because the OS should already do it for you.
Indeed - but Windows does start a fair amount of stuff that a lot of people don't need. There's been a lot of work done by the kernel and "core OS" guys to reduce boot time, on a lot of different levels; there have been some pretty interesting blogs and videos/interviews with Mark Russinovich on Win7 kernel goodness.

If all Soluto does now is look at boot time, then oh well - whatever. Especially if it's just by disabling and/or delaying load of services and startup apps. Sure, the graphics in the first post are pretty, and the idea of leveraging flock wisdom (I wonder what data exactly they're collecting...) is nice-ish. But disabling/delaying is nothing revolutionary.
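
For the curious: "managing startup apps" mostly boils down to juggling a handful of well-known locations. A minimal C# sketch (nothing Soluto-specific - just the classic Run registry keys; services, scheduled tasks and the Startup folder are separate mechanisms) that lists what's set to launch at logon:

using System;
using Microsoft.Win32;

class StartupLister
{
    const string RunKey = @"Software\Microsoft\Windows\CurrentVersion\Run";

    static void Dump(string label, RegistryKey root)
    {
        using (RegistryKey key = root.OpenSubKey(RunKey))
        {
            if (key == null) return;
            foreach (string name in key.GetValueNames())
                Console.WriteLine("{0}: {1} = {2}", label, name, key.GetValue(name));
        }
    }

    static void Main()
    {
        Dump("HKCU", Registry.CurrentUser);  // per-user startup entries
        Dump("HKLM", Registry.LocalMachine); // per-machine startup entries
    }
}

Delaying an app is then just a matter of removing its Run entry and launching it yourself a while after logon - nothing revolutionary, like I said.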

And apart from HTPCs, is bootup time really that important? My workstation is usually turned on most of the day (goes to standby if I'm away from it for more than 10-15 minutes, shut off if I'm away for several hours). My laptop does standby or hibernation.

For me, the boot sequence spends a lot of time in BIOS initialization and all that jazz - from the moment the Windows splash screen first shows to the desktop appearing takes less time than the BIOS init sequence. Time until the desktop appears and the machine is usable is relatively fast, and most of it is spent loading the apps I depend on (FARR, LaunchBarCommander, etc.).

And once you're at your desktop and ready - which is all a startup-management app can really help with - there's still the issue of starting whatever programs you use for the task(s) you're going to work on... keeping your machine on (or resuming from standby, or waking it from hibernation) is going to be a lot more efficient than micro-managing services.

But perhaps I'm missing something? I'm not going to install a piece of software with such a buzz-wordy tech-info-free site, especially not when there's this text (emphasis mine):
Our patent-pending low level driver technology detects when you’re frustrated by your PC and tells you which application is causing it.
Driver technology? If they're using a kernel driver to look at boot time, I'm a bit worried. If they're not using a driver, they're IMHO guilty of bad advertising.

EDIT:
OK, just watched their beta demo video - looks like it's not even managing services, just startup apps? (At least there will be no misguided recommendations to disable SuperFetch, then :P). There are a few things to say about that video, most of them positive:
  • The user interface is great. Probably results in the app being bloated, but it has a really great & intuitive look.
  • The boot-duration timeline could turn out to be useful.
  • The built-in wiki integration looks nice, but I do wonder if it's going to turn out well. A lot of users are, sorry to say it, morons :)
  • I'm interested in the after-bootup "My PC Just Frustrated Me" thing - what data is being sent? How can just sending whatever data be of any help without a description of what was frustrating? This looks a bit like snake oil.
2104
Living Room / Re: Laptop choice: better CPU or more RAM ?
« Last post by f0dder on November 02, 2010, 10:50 AM »
FWIW: my girlfriend had 2x2048meg DDR2-667 laptop memory modules lying around that she didn't use, so I swapped my 1x2048meg DDR2-800 module for those. The only benchmark I've done is running WinRAR's built-in benchmark (x64 version, 3.92, multithreaded). Dualcore Core2 P7350@2.0GHz.

Performance went from ~1060KB/s to ~1020KB/s... that's a 3.77% performance degradation, even though the RAM is theoretically 16.63% slower. OTOH the system should now be able to take advantage of dual-channel memory, but then there's also module latency to take into consideration - I haven't checked the latency of the modules.

Would you be able to feel a less-than-4% performance drop? It's nothing - and compared to hardly ever hitting the pagefile because there's double the amount of RAM, more (but slower) memory is a clear benefit for me.

Different apps have different requirements, but even among performance-critical apps, I don't know which ones actually require insane RAM bandwidth - as soon as you're doing heavy processing and not just moving data around, bandwidth matters less. And that of course also depends on CPU speed.

YMMV - but if you can get more but (slightly) slower RAM cheaper than less but faster, I'd go for more-but-(slightly)-slower.
2105
Keep in mind also that there's some protocol overhead, not just for the TCP (or, perhaps more likely for streaming media, UDP) but also the streaming protocol used; it's not going to be a lot, but it all adds up.
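
To put rough numbers on the header overhead - a little back-of-the-envelope C# sketch (assuming RTP over UDP/IPv4/Ethernet with typical header sizes; real stacks vary, e.g. VLAN tags or IPv6):

using System;

class OverheadCalc
{
    static void Main()
    {
        // Typical per-packet header sizes in bytes (assumed values).
        const int ethernet = 14 + 4; // header + frame check sequence
        const int ip = 20;           // IPv4, no options
        const int udp = 8;
        const int rtp = 12;
        const int headers = ethernet + ip + udp + rtp;

        foreach (int payload in new[] { 512, 1024, 1400 })
        {
            double overheadPct = 100.0 * headers / (payload + headers);
            Console.WriteLine("payload {0,4} bytes -> {1:F1}% overhead",
                              payload, overheadPct);
        }
    }
}

So somewhere in the 4-10% range for sane packet sizes - not a lot, but as said, it all adds up.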

AndyM's suggestion of monitoring actual network traffic is good - just make sure there's not a lot of other network activity going on while testing how much the streaming uses.

My school has a policy that blocks most destination ports (thankfully not ftp or ssh - that would be quite annoying for a place teaching computer science :)), but the World Of Warcrack addicts bypass that using proxies...
2106
Living Room / Re: Software... Heal thyself!
« Last post by f0dder on November 01, 2010, 12:22 PM »
Interesting idea, but I'm not sure it's something that should be running on end-users' computers. And just how does it detect exploits? The article keeps mentioning crashes... a successful exploit doesn't crash the app it's taking over.

Seems to me like Microsoft devs might also have an easier time of it if they set up a whole roomful of these things pecking away at various Windows components...
Microsoft are already doing a lot of interesting things security-wise, but Windows is a doyc-damn huge beast... and there's a lot of different ways to exploit software.
2107
You'll have to update Flash and Java until all the security flaws have been patched up.

Oh, wait... :-\
2108
Living Room / Re: Laptop choice: better CPU or more RAM ?
« Last post by f0dder on November 01, 2010, 03:13 AM »
All these "glare-type" displays actually suck.
Depends on where you're going to use your laptop, and what you're using it for.

Makes it pretty much impossible to use outside in sunshine, but it's a nice crisp display when used inside. YMMV.
2109
Living Room / Re: Limewire shutdown, permanently
« Last post by f0dder on November 01, 2010, 03:12 AM »
Meat Is Murder - and I'm lovin' it.
2110
DC Gamer Club / Re: Minecraft - An Incredible Indie Game
« Last post by f0dder on October 31, 2010, 03:08 PM »
How do you get to the Nether? By "digging too deep"? Also, I thought torches were going to burn out, but, so far, the torches I placed before the update don't?
2111
Living Room / Re: Head in the Clouds web comic by DC member -- a new favorite
« Last post by f0dder on October 30, 2010, 10:51 AM »
Haha, that's a nice one indeed :D
2112
Living Room / Re: Axsotic 3D Spherical "Mouse" Ball
« Last post by f0dder on October 30, 2010, 10:47 AM »
6DOF isn't new, by any means - I had one of these several years ago.
[attached image: sporb.jpg]

The Axsotic might be interesting because it doesn't use mechanical sensors... the SpaceOrb was precise enough, imho, but for gaming you needed to set its sensitivity pretty high, which my motor skills simply aren't good enough to handle :)

Since it doesn't use mechanical sensors, I wonder if the Axsotic will require you to constantly move the input device in order to move stuff; with the SpaceOrb, you could circle-strafe around a target just by holding the ball in the right position. If the target moved, you needed only slight hand movements.
2113
Developer's Corner / Re: The Composed Method pattern
« Last post by f0dder on October 29, 2010, 04:32 AM »
I find that in actual practice it's more convenient to limit methods to what can fit on my monitor without scrolling.  In other words, there's nothing wrong with a 20-line method, even though I could break it into two or three even smaller methods.  As long as you get the size of the method down to a reasonably small chunk of code that you can get your head around.
Really depends on what you're doing, IMHO. The advantage of splitting stuff into small functions, sometimes even as small as 1-3 lines, is that your source code reads pretty well on its own - eliminating the need for comments.
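
A contrived little C# sketch of the Composed Method idea (hypothetical order-processing names, obviously) - the top-level method ends up reading like the comments you'd otherwise have to write:

using System;
using System.Collections.Generic;
using System.Linq;

class OrderProcessor
{
    // The top-level method is just a sequence of well-named steps.
    public void Process(List<decimal> itemPrices)
    {
        ValidateOrder(itemPrices);
        decimal total = CalculateTotal(itemPrices);
        PrintReceipt(total);
    }

    static void ValidateOrder(List<decimal> itemPrices)
    {
        if (itemPrices == null || itemPrices.Count == 0)
            throw new ArgumentException("Order must contain at least one item.");
    }

    static decimal CalculateTotal(IEnumerable<decimal> itemPrices)
    {
        return itemPrices.Sum();
    }

    static void PrintReceipt(decimal total)
    {
        Console.WriteLine("Total: {0:C}", total);
    }
}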

The practice can also be taken too far, though, resulting in hard-to-navigate almost-spaghetti code.
2114
General Software Discussion / Re: Best Executable Compressor Programs
« Last post by f0dder on October 29, 2010, 04:00 AM »
Hm, speeds up execution...

The application will load from disk a bit faster - in theory. Thing is, Windows does demand-paged execution, meaning it won't load parts of your application from disk until they're actually used. This is defeated by exe compression, since the entire module has to be loaded and decompressed at once (I'm not sure how much of a .NET assembly is loaded under normal execution... so perhaps this isn't such a big problem there).
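
To illustrate what demand paging buys you - a minimal .NET 4 sketch using a memory-mapped file, which is the same OS mechanism the image loader uses for (uncompressed) executables:

using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class DemandPagingDemo
{
    static void Main()
    {
        // Map a 16MB file; Windows only reads pages off disk as they're
        // actually touched, instead of up-front.
        string path = Path.GetTempFileName();
        File.WriteAllBytes(path, new byte[16 * 1024 * 1024]);

        using (var mmf = MemoryMappedFile.CreateFromFile(
                   path, FileMode.Open, null, 0, MemoryMappedFileAccess.Read))
        using (var view = mmf.CreateViewAccessor(0, 0, MemoryMappedFileAccess.Read))
        {
            // Only the pages around these two offsets get faulted in;
            // the rest of the file never hits physical memory.
            Console.WriteLine(view.ReadByte(0));
            Console.WriteLine(view.ReadByte(8 * 1024 * 1024));
        }
        File.Delete(path);
    }
}

An exe packer forfeits this: the unpacking stub has to read and decompress the whole image before anything can run.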

There's additional CPU usage because of the decompression, but the decompressors are fast - even on older machines, the time to decompress the additional data is a lot lower than reading data from disk (but again, we're ignoring demand-paging here).

HOWEVER, in reality, your executable is likely to load a lot slower on a lot of end-user systems regardless. Why? Anti-malware software. When doing on-demand scanning of executables, the anti-malware software has to decompress the executable before it can scan it - and for safety reasons, it can't just execute the executable's own decompressor; it has to have specific support for the exe packer used, or run (slow!) sandboxed code emulation.
2115
Living Room / Re: Laptop choice: better CPU or more RAM ?
« Last post by f0dder on October 29, 2010, 03:54 AM »
I *think* it's when you get into video editing and gaming that you'll notice a difference.
Hm, does video editing require much of the GPU? I would expect CPU (de/compression of streams) and harddrive (speed, especially if dealing with HD content, and to some extent size) to be a lot more important :)

I've got a "Mobile Intel(R) Series 4 Express" GMA adapter in my laptop, which is slow even compared to the Intel onboard stuff of today (the laptop is 2½ years old) - and it's quite fine for running Win7, only lagging in games.
2116
Living Room / Re: Limewire shutdown, permanently
« Last post by f0dder on October 28, 2010, 09:17 AM »
Apart from game updates being distributed with BitTorrent (Blizzard's WoW patches were the first place I saw this), there's lots of other legit content as well:
*  debian-506-i386-DVD-1.iso
*           done     4459.2 MB Rate:  82.3 /   0.0 KB Uploaded: 182964.1 MB                 [T  R: 41.03 low]
* Tracker: [Tried all trackers.]
-rtorrent

...and as opposed to an application like LimeWire, there are numerous programs around implementing the BitTorrent protocol, so there's no single company to target and no single control server - and there's trackerless support for BitTorrent as well.
2117
Living Room / Re: Laptop choice: better CPU or more RAM ?
« Last post by f0dder on October 28, 2010, 05:27 AM »
I've seen Windows 7 with 2GB memory and it's a lot slower than I'm used to with XP and 2GB ram, but then that could have been because of a crappy CPU and absence of graphics card too :-\
I've run Win7/x64 on my 2GB laptop for quite a while, and comfortably used it for both Java and C# development - both Eclipse and Visual Studio are relatively RAM-hungry, and I've been running SQL Server Express on the machine as well (granted, with very small development databases, but it's still SQL Server :)).

There is a version of this machine with a 256MB graphics card for about 550, but according to reviews that's a bit noisy then...
I wouldn't go for it - dedicated GPUs tend to use more power and get hotter (not nice in a cramped laptop), be noisier, etc. Especially not when you indicate the machine won't be used for anything graphics-heavy.
2118
Developer's Corner / Re: EtherCodes Collaborative Online Code Editing Pad
« Last post by f0dder on October 28, 2010, 05:11 AM »
It seems like a useful replacement for pastebin and the like... nothing more than that, but that by itself is nice enough.
2119
I really thought that just wouldn't work, should have educated myself first I guess :-[
Well, for anybody who has tried moving a system harddrive from one machine to another (doing "P2P" but without any fix-up steps) and experienced an INACCESSIBLE_BOOT_DEVICE BSOD, it's a pretty fair assumption :)

Just imaging a system and plugging that directly into a VM probably doesn't work too well; there's usually some fixup step involved.
2120
Trying to run an XP install under different hardware (i.e. a VM) than it was setup on is asking for trouble, if not impossible.
P2V migration actually works pretty well - it's going V2P that tends to cause trouble.

Anyway, I'm definitely in favor of imaging the old XP install, then doing a fresh Win7 install and getting things working steadily, while being able to use the old XP install in a VM or on a spare computer. Definitely don't get into dual-booting or upgrade installs.
2121
Living Room / Re: Laptop choice: better CPU or more RAM ?
« Last post by f0dder on October 28, 2010, 02:23 AM »
And I'd get discrete video if at all possible.  Shared memory sucks *really* bad.  But it's one of those things you don't notice until you have it.
That depends a lot on what you're going to do with the laptop.

I've got Intel GMA on mine, and one of the relatively low-powered versions at that. The machine isn't suitable for gaming, but it works perfectly for Win7/Aero, Visual Studio 2010's WPF-based GUI, and watching HD movies (720p only though, the CPU can't handle 1080p).

Tomos: what's the machine going to be used for?
2122
Announce Your Software/Service/Product / Re: crack tracker
« Last post by f0dder on October 27, 2010, 02:32 AM »
Doing a key-value format, even as XML, doesn't bloat size out of the "emailable" realm - and the added flexibility is well worth it, IMHO.
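
A quick hypothetical illustration of what I mean (made-up field names, and a licensing record chosen purely as an example):

using System;
using System.Xml.Linq;

class FormatComparison
{
    static void Main()
    {
        // Pipe-delimited: compact, but positional - adding a field, or a
        // value that happens to contain '|', breaks every existing parser.
        string piped = "JohnDoe|ACME-1.0|2010-10-27";

        // Key-value as XML: bigger, but self-describing, order-independent,
        // extensible, and escaping comes for free.
        var xml = new XElement("license",
            new XElement("user", "JohnDoe"),
            new XElement("product", "ACME-1.0"),
            new XElement("issued", "2010-10-27"));

        Console.WriteLine(piped.Length);          // 27 chars
        Console.WriteLine(xml.ToString().Length); // ~110 chars - still tiny
    }
}

Even with the XML tax, we're talking a hundred-odd bytes - nowhere near un-emailable.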

But hey, it might work well for your purposes, and if you're lucky you might not need to expand later on... pipe-delimiting just isn't a good general solution (again, IMHO).
2123
Announce Your Software/Service/Product / Re: dot Net complaints
« Last post by f0dder on October 26, 2010, 01:52 AM »
The compacting is done to the CLR heap to remove memory fragmentation - doesn't necessarily mean the amount of win32 memory allocated for the CLR heap changes in any way.
2124
Announce Your Software/Service/Product / Re: dot Net complaints
« Last post by f0dder on October 25, 2010, 04:14 PM »
wraith808, garbage collection is done entirely at the CLR's mercy; depending on win32 memory pressure and which heap generation your object is in, when it's collected can vary a lot. This is just one of the reasons why you shouldn't depend on finalizers being called. Furthermore, just because a bunch of your CLR objects are being collected doesn't mean the used win32 memory is released - and this makes sense, because allocating system memory is "slow", so (if you're thinking only of the running .NET process and not the entire system) it makes a lot of sense to hang on to the win32 memory even if it's no longer strictly needed.

There are several different GC profiles your app can use, with different heuristics for when and how the GC works. There's also manual GC interaction you can do, but you should be really careful with it, since it can seriously pessimize your app's performance.
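
A small C# sketch of the non-determinism (and of the manual knobs, which - again - you usually shouldn't touch):

using System;

class Noisy
{
    // Finalizers run at the GC's mercy - maybe soon, maybe only at
    // process exit. Never put required cleanup here; use IDisposable.
    ~Noisy() { Console.WriteLine("finalized"); }
}

class GcDemo
{
    static void Main()
    {
        new Noisy(); // immediately unreachable
        Console.WriteLine("object is unreachable now...");
        // Nothing has been finalized yet - no collection has happened.

        // Manual interaction exists, but forcing collections like this
        // in real code usually hurts performance more than it helps:
        GC.Collect();
        GC.WaitForPendingFinalizers();
        Console.WriteLine("after forced collect");
    }
}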
2125
Announce Your Software/Service/Product / Re: dot Net complaints
« Last post by f0dder on October 25, 2010, 02:27 PM »
For example, the immutability of strings allows the reuse of a single instance of a string value, without allocating multiple redundant values.
Possible even in C++, although that's usually implemented with COW, which can be a bottleneck in multi-threaded apps.

.NET also supports string interning, which in theory is cool, but has to be used very sparingly:
If you are trying to reduce the total amount of memory your application allocates, keep in mind that interning a string has two unwanted side effects. First, the memory allocated for interned String objects is not likely to be released until the common language runtime (CLR) terminates. The reason is that the CLR's reference to the interned String object can persist after your application, or even your application domain, terminates. Second, to intern a string, you must first create the string. The memory used by the String object must still be allocated, even though the memory will eventually be garbage collected.
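
Interning in a nutshell - a small C# sketch:

using System;

class InternDemo
{
    static void Main()
    {
        // A string built at runtime is a distinct heap object...
        string a = new string("hello".ToCharArray());
        string b = "hello"; // literals are interned by the compiler

        Console.WriteLine(ReferenceEquals(a, b)); // False

        // ...until you intern it - now both names refer to the single
        // pooled instance, which lives until the CLR shuts down.
        string c = string.Intern(a);
        Console.WriteLine(ReferenceEquals(c, b)); // True
    }
}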

Also, the CLR design of generics is much more efficient than any other language/platform that I'm aware of. In some cases this can allow the source code to be much smaller. Basically, the definition of MyGeneric<MyClass> only needs to be stored once; whereas C++ for example must separately compile this for each different MyClass that's used.
Might be true for the IL code generated, but what happens when the JIT'er runs? :) - also, there's object allocation overhead every time you use a delegate... which includes the very innocent-looking lambda expressions. Setting up the closures might be relatively inexpensive, but it isn't free (I measured a 10x speed hit in object serialization because of an INotifyPropertyChanged implementation using lambda expressions).
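
To make the lambda point concrete - a small C# sketch (hypothetical names) showing where the hidden allocation comes from:

using System;

class ClosureDemo
{
    static int Doubled(int x)
    {
        return x * 2;
    }

    static void Main()
    {
        int factor = 2;

        // This lambda captures the local 'factor', so the compiler
        // generates a closure class behind the scenes - and an instance
        // of it (plus the delegate) gets allocated at runtime. Cheap
        // individually, noticeable in a hot path like serialization.
        Func<int, int> f = x => x * factor;

        // A non-capturing alternative (plain static method) avoids the
        // closure allocation entirely.
        Func<int, int> g = Doubled;

        Console.WriteLine(f(21)); // 42
        Console.WriteLine(g(21)); // 42
    }
}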

And as mentioned earlier, we have to keep the difference between win32 memory usage and CLR memory usage in mind. There are reasons for it: not freeing win32 memory right away means subsequent CLR allocations can be done faster. But holding on to (win32) memory until system memory pressure is high enough might leave other apps deciding against, say, allocating more cache because the available (win32) memory is low. Pros and cons.

In general, then, some kinds of programs will take more memory, and some may take less. But that's really comparing the same program, ported to different platforms. I'm betting that if you design your code from the ground up with an understanding of .Net (or whatever platform you're building for), you should be able to come up with a design that meshes well with whatever criteria are important to you.
Wise words. Idiomatic .NET (at least C#) programming does tend to involve a fair number of objects being created, though. Fortunately a lot of them are short-lived and get collected fast, without putting much pressure on the win32 memory. Still, there's a fair amount of memory overhead from the framework. This becomes pretty inconsequential in larger apps that need a lot of memory for <whatever> processing, but it can be noticeable in small tools. Whether this matters depends on the situation :)

The CLR memory model is really interesting when considering long-running server applications; for normal unmanaged apps, memory fragmentation can end up being a pretty big issue unless you're writing custom allocators. With .NET, you get address space defragmentation for free.