Show Posts


Messages - Shades

I'm not familiar with the forum software you use on the back-end of your site, but most (if not all) have a 404 page (or landing page, as you call it) built in. Some only display a single line of text, but it is up to you how to dress that page up: you can style it any way you want with a (WYSIWYG?) HTML editor, and there are enough free ones around.

Personally, I would spend the time to make a rather nice-looking (yet simple) 404 page with a link to the home page of your forum, a (funny) message regarding the 404 error that fits the overall theme of your forum, plus a message about being automatically redirected to the home page, using the HTML feature called meta refresh.

That makes users who get the 404 message automatically redirect to the forum home page (or any other page you like). You can specify after how much time such a redirect should take place: 30 seconds, maybe a minute, something like that.
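A minimal sketch of such a page (the forum URL and the jokey text are placeholders, of course; where the 404 template lives depends on your forum software):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- meta refresh: send the visitor to the forum home page after 30 seconds -->
  <meta http-equiv="refresh" content="30; url=https://example.com/forum/">
  <title>404 - Page not found</title>
</head>
<body>
  <h1>Oops, that page rode off into the sunset.</h1>
  <p>You will be taken back to the
     <a href="https://example.com/forum/">forum home page</a> in 30 seconds.</p>
</body>
</html>
```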

While it doesn't do much for search engine rankings, your users and search engine bots do appreciate a proper 404 page.

How easy or difficult all of the above is depends on the design of the forum software, and on your willingness/time to learn a bit about the underlying system of the forum software you use and a bit of basic HTML.

For your reference: examples of 404-pages from established websites.

You can ask Google to crawl through your site again. Normally, sites are re-indexed automatically every few weeks (for low-traffic ones). This link provides some info about re-crawling.

Landing pages are good, sitemaps are better. Having a web site structure that Google expects you to have will increase your ranking. Having other sites refer to your content will get you a much bigger jump in ranking, though, but only if those pages have a good ranking too. Highly ranked sites are re-crawled more often than low-ranked ones.

While it is good that you moved away from the vBulletin forum software on your site, you can expect some issues when changing the forum software that you run as your back-end.

How motorcyclists think people react when they drive by...

Then let me counter with this video. You have to get through the first 2 minutes or so with a narrator who is not too fluent in English and has a heavy accent. But after those two minutes or so, the engines on the bikes start doing their "thing"...

Honda CBX 1050:

Go for the Education versions, as these are mostly akin to the Enterprise versions of Windows 10. N is, as far as I know, a version that has some built-in Windows multimedia options removed; European legislation is to thank for that. If you don't care, or use your own set of multimedia software instead of the ones provided by Microsoft, get the N version.

You can use Education versions as long as you are registered as a student at an institution that Microsoft acknowledges as a school/university, although I don't think they actively check on that. It also depends on which part of the world you live in.

Wikipedia link to an overview of Windows 10 versions and their capabilities. That should give you an idea which version to choose.

Windows 10 build 1809 looks to be the more stable one. People migrating from build 18xx to 1903 have reported many problematic errors. Maybe a completely fresh install using build 1903 fares better, but I'm not sure of that. Usually a fresh Windows 10 install works better than migrating from one build to a newer one. Also, you can expect even more problems if you were to migrate from build 17xx to 1903.

But as you want to keep as much as possible from your Windows 8.1 installation, you limit yourself to the migration option. A plain Windows 8.1 installation is not likely to give you a lot of problems during the migration to Windows 10, but you can expect migration problems, especially when using older/specific software or hardware (drivers).

If possible, create a Virtual Machine and install a trial version of Windows 10 (same build as you are going to get from your university). Then test whether the software you depend on works inside this virtual machine. I remember you are using some very specific (and older) software. If that software keeps working to your satisfaction, go through the motions of migrating your computer and discard the Virtual Machine you created.

But if your software isn't working, then keep using Windows 8.1 until Microsoft drops support for it officially and see what you can do about getting a newer version of your software that does work with Windows 10. Or go and find an alternative for your special software, see if that works and migrate to that alternative software on Windows 10.

There used to be a tool, hosted by Microsoft, that could make an educated "guess" whether your current hardware supports Windows 10 or not. But I think that was for Windows 10 builds 15xx. For all intents and purposes, Windows 10 builds 15xx aren't officially supported by Microsoft anymore. Even builds 16xx are out of support (with the exception of the LTSB version of Windows 10, if I remember correctly).

Understand that there are big differences between Windows 10 builds, so if your software works with one build, it is not a given that it will remain working in the next Windows 10 build. Enterprise versions of Windows 10 allow you to postpone migrations from one Windows 10 build to a new build for 1 year maximum. Education versions of Windows 10 are in most respects the same as Enterprise versions of Windows 10, so I assume that you have the same option to postpone.

It comes down to the ability of your special software to work on Windows 10, to see if migrating your computer to Windows 10 is a good idea or not.

I have no problem surfing with any browser on any system here internally to visit my 'internal only' mediawiki installation. Which can be accessed by typing the following:     Works in FireFox, Palemoon, Chrome, Opera, Internet Explorer, Edge (both).

Chrome/Chromium-based browsers do have the issue as reported when using names. That's because these browsers consult external DNS servers first, and those don't know about your internally hosted website(s), so the lookup fails.

The way to combat that is to run your own DNS server internally and configure your router (or the device that hands out the IP addresses in your LAN) to use the internal DNS server as primary and the external DNS server(s) of your choice as secondary, tertiary, etc. That's what I have done here, and it works for the Chrome/Chromium-based browsers.

Way too much effort for a standard home network setup? Sure. But you must keep the Chrome/Chromium-based browser fanatics appeased. Even if you can show that FF is the better browser in this regard.
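For reference, a sketch of what such an internal DNS setup could look like with dnsmasq on a small Linux box (the domain, host name and addresses are made-up examples):

```conf
# /etc/dnsmasq.conf -- answer queries for internal names, forward the rest
domain=lan.local
local=/lan.local/                      # never forward lan.local queries upstream
address=/wiki.lan.local/192.168.1.10   # the internally hosted mediawiki
server=9.9.9.9                         # upstream resolver for everything else
```

Point the DHCP settings on your router at the box running dnsmasq, and every client, Chrome included, resolves the internal names first.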

Living Room / Re: Hotmail hates non-Hotmail: is there a cure?
« on: September 23, 2019, 08:24 PM »
No, the mail server you use to send your email message(s) to Hotmail appears not to be configured the way the email server at Hotmail expects it to be. Because of that, any message it receives from the mail server that you use will be regarded as spam. Always, that is, until the mail server admin(s) at Hotmail decide that messages from your mail server should be thrown away immediately. Then those won't even be delivered into the spam folder of the receiving mail account anymore.

It has to do with certified domains, DMARC, reverse DNS settings, the server you use being on a blacklist of sorts, SPF, DKIM... a configuration error on your mail server in any or all of these items can be enough to trigger the mail server at Hotmail to mark your messages as spam.

Mail services like Google's Gmail, Hotmail and pretty much all mail services you need to pay for have all the things I mentioned configured correctly and won't give you problems sending mail to Hotmail. But if you use an obscure service, get a better mail service. These items have been known for a while now, and any professional mail service has them applied on its mail server(s).

What mail server and/or service do you use to send your mail? Or do you run your own private mail server? In that case, you must start reading up on and applying these items as soon as possible, because more and more mail servers apply them to combat spam. While some items can be added rather easily to your own mail server, others require contact with, and configuration at, the end of the service where you get your IP addresses from.
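As an illustration of two of those items: SPF and DMARC are nothing more than TXT records in your domain's DNS zone (example.com and the addresses below are placeholders):

```conf
; SPF: only the listed server may send mail for this domain
example.com.         IN TXT  "v=spf1 ip4:203.0.113.25 -all"

; DMARC: tell receiving servers what to do with mail that fails SPF/DKIM checks
_dmarc.example.com.  IN TXT  "v=DMARC1; p=quarantine; rua=mailto:postmaster@example.com"
```

DKIM also involves signing outgoing messages on the mail server itself, so that one takes a bit more work than just a DNS record.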

Remove the gateway setting in the network settings and only allow admin(s) to adjust the network settings. That will prevent internet access, while the LAN keeps working, after a fashion. Meaning that if you know the IP numbers of the machines in the LAN you want to "talk" to, you can still access them normally. NetBIOS settings should take care of granting access to computers in the LAN by name, but as I don't know how you have set up your network, that might not work. It can be problematic with shared printers, depending on how you have shared the printer(s) in your network. That is, if you do not run your own (properly configured) DNS server in your LAN.

It is a pretty brutal solution, but you are sure no one's attention is diverted to the internet. Except maybe to finding ways around the block you just put in front of your students. But that's another matter.
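A sketch of the gateway-less setup on a Windows client, assuming the interface is called "Ethernet" and the LAN uses 192.168.1.x (run from an elevated prompt; adjust names and addresses to your own network):

```bat
:: Set a static address but deliberately omit the default gateway,
:: so LAN traffic keeps working while internet traffic has no route out.
netsh interface ip set address name="Ethernet" static 192.168.1.50 255.255.255.0
```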

Which of these programs can effectively post to the Net? (presumably by creating an HTML file).

Allowing that you may have to purchase a domain name and use your hosting platform.
i seem to remember one of the old school ones was good at this like TreedbNotes.  but im pretty sure rightnote can export to html.  onenote cannot directly, but you can export to word and then from word save as html.

Friends don't let friends use Word to convert anything to HTML. Just saying...

As an example:
I have an extensive Word document. The docx version is almost 6 MByte in size. After I do a manual conversion to HTML, the resulting document is a bit over 4 MByte. When I use Word as the converter, the resulting HTML file is almost 17.5 MByte. I have also converted this Word document to the AsciiDoc format, and that is just shy of 2 MByte.

Any version of Word creates a pile of cruft when converting a document to HTML.

N.A.N.Y. 2020 / Re: Systemus
« on: September 09, 2019, 11:19 PM »
A Handy Dandy System Admin Tool

Good! One of those "why haven't I done that" ideas.

Unfortunately, that is not the case: GEGeek Toolkit. When the guy started with this project he offered it as freeware; nowadays you must donate before you even get access to it. I found an old freeware version of this software and it is impressive. All the tools you need to do every (maintenance) task you can think of on a Windows computer. Most of it is portable software, some software will install. It also keeps an eye out for updates of the software you use with GEGeek, and it allows you to easily launch the software of your choosing.

As the version I have is old (2012, if I remember correctly), there are quite a few issues with available software and the updates. But you do see what a Herculean effort this is and why he (more or less) demands donations for the latest and greatest. Working with that old version (or trying to make it work by yourself) isn't recommended.

GEGeek, by now, is very extensive and I am usually of the opinion that less is more. So I am interested in what the OP can come up with. 


Interesting news. I do have to say, something about the fact that the guy who wrote the article referred to himself and quoted himself in the third-person several times throughout just seems "off" to me. It seems a bit misleading.

Nonetheless, I can't say I'm surprised by the news. As someone who used to be a big proponent of Google and their services, it's sad to see what's become of them.

How did the saying go: "You either die a hero, or you live long enough to see yourself become the villain." Or something like that.

Google, after becoming a subsidiary of Alphabet, slid down the slippery slope even faster than it did as a single entity. At least, that is the impression I got from them. But not all blame lies with them. Companies/corporations that (mis-)use the collected data need to be blamed too. And shamed as well, even after paying a heavy fine. Perhaps things would fix themselves if the heads of these companies became personally, and therefore financially, responsible for the collected data.

After all, one learns lessons much quicker when it hurts, either financially or physically.

If you barely play games, or if you play older titles (4 years or older), you could go for a GPU that is passively cooled. I try to get those cards for workstations (as those shouldn't be used for gaming at all), but also for personal use. They can handle quite some load, and once I had one with such a large heatsink mounted that I could add a standard 80x80 mm case fan onto it (once the owner started to complain that his system would reboot "at random").

A "there, I fixed it for you" solution? Yes, but it kept going for another 2 years. Likely it still works, but it is an AGP model... I am not allowed to throw away anything unless approved.  ;) Case fans are easy to come by and easy to replace; GPU coolers are not. If you get a second-hand card for donor purposes, you should ask how hard those donor fans had to work before the chips gave out.

New passively cooled GPUs usually come with (much) smaller heatsinks, so extending their "life" by adding a case fan is often not an option anymore. But my boss is getting a 3D printer, and when that thing works, options will be back. Mind you, I can live/work with a PC that has fans working at full power all the time. Because that is the thing with case fans: not all models come with hardware that allows their RPM to be adjusted. It's either 'not at all' or 'full on, baby'. The fact that I more often than not wear headphones when working sure helps in that regard.

By default, GPUs in desktops (towers) have their fan(s) pointing downwards. This is not the most ideal position, as air is sucked into the card, heated, and then blown out at the bottom. Dust particles in that air can and do collect inside the GPU cooling fan(s), and dust tends to stick to fan blades too. Given enough time, those cause an imbalance that becomes worse until you actually start to hear it. By that time, it will be very hard to get those fans back into proper working order again. And if you take those fans apart, it is not a given that their construction allows you to put them back together again in a proper fashion. The times I have tried, it was always working well for a week or two, but then the problems would be back, often with a vengeance.

GPU fans are often small and of an irregular size, even between models from the same manufacturer. Replacement or repair isn't always an option. So, most people go and buy a new GPU. Meaning there is no incentive for NVidia or AMD to give a hoot about repair/replacement of GPU fans. And in countries where you can get replacement parts easily, who cares... 

Here in Paraguay, Amazon, Newegg, eBay and the like are not an option. It can easily take 3 or 4 weeks for things to arrive from the US at the PY border, and your order gets slapped with a border "tax" that can be ridiculous. On top of that, there are not many stores in the capital that sell PC hardware, and the few that do charge a lot. So much that, if you need one or more parts costing 200 USD or more, you are better off taking a vacation day and catching the bus to a border city about 450 kilometers away from the capital to buy parts there. Yes, the price difference makes the travel worth it.

A time sink that huge is always a possibility here, though, so repair is still a thing in these parts of the world.

General Software Discussion / Re: Looking for AsciiDoc editor
« on: September 01, 2019, 11:30 PM »
It was the same here. For my own uses I run a wiki and would have been happy to keep documentation that way. Markdown would have been my second choice at that time. And at that point in time I had never heard of (or cared about) AsciiDoc either.

But I have been going through the manuals of both Markdown and AsciiDoc, and I have gotten quite a bit of experience with AsciiDoc. It is easier to write documentation in AsciiDoc, and the syntax is more human-readable than Markdown (especially when you start to use more complicated items/structures in your documentation).
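To give an idea of what that looks like, a small made-up AsciiDoc fragment; cross-references and tables are two of the places where it pulls ahead of plain Markdown:

```asciidoc
= Script Command Reference
:toc: left

== copy-file

Copies a file. See also <<move-file>> for the destructive variant.

[cols="1,2"]
|===
|Parameter |Meaning

|source    |Path of the file to copy
|target    |Path of the destination
|===
```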

By all means, keep using what you are familiar with. This whole thread started as a question, but could be seen as a reference or starting point now. For those in need, of course. And if I find new and/or better tooling for working with AsciiDoc, I'll update this thread again.

General Software Discussion / Re: Looking for AsciiDoc editor
« on: August 30, 2019, 11:18 PM »
Update time:

I have been very busy the last 2 weeks converting a document that has been growing for over 15 years. Not only is the document extensive, it is also filled to the brim with internal and external references. Although the document looks rather simple when you view it in Word, it isn't, and I suspect that knocked Pandoc a bit off its rocker, producing a pretty big mess after conversion.

So the last weeks I have been busy taking the document apart into the tiniest pieces, creating templates for those pieces, repairing whatever was garbled by Pandoc, and building it back up again.
In the meantime I have worked a lot with AsciidocFX, Brackets (+ AsciiDoc plugin), IntelliJ IDEA (v2019.2 Community edition + AsciiDoc plugin), Eclipse (+ Asciidoctor plugin), VSCode 1.37.1 (AsciiDoc plugin) and Notepad++ with an AsciiDoc extension. The last one is little more than a new language definition added for colored syntax; there is really nothing more to it.

My experiences so far:
All editors, with the exception of Notepad++, consume a boatload of resources when working on more complex documents. VSCode was the worst of the lot in my case. After 30 minutes or so, it would use around 6 GByte of RAM and continuously between 80% and 90% of all CPU resources. Proper previews were a problem as well. Not a success.

Then I tried IntelliJ IDEA. That also consumed almost as much RAM and CPU as VSCode did, but that was somewhat justified, as the preview worked better and it would also validate syntax/style and show you where you were making (minor) mistakes. While that last part is very handy when working with more complex documentation, it was still too much of a burden on this PC (an A10 APU at 4 GHz with 24 GByte of RAM).

By that time, I was thinking "to hell with it" and used Brackets. I had tried that editor a few years back and didn't like the experience one bit, but this time around it was pretty nice to use with AsciiDoc. There is no real-time preview available, so it isn't consuming that many resources. You can, however, enable a preview at your convenience. The preview isn't as complete as the one from IntelliJ IDEA, but way better than the one from VSCode. There is also a section in the preview that shows you syntax/style errors (rudimentary, but still).

For "funsies", I also tried Eclipse again with the now nearly finished document. The real-time preview functionality in that editor stands head and shoulders above the rest regarding rendering speed. A very pleasant surprise that was. It takes between 10 and 20 seconds to do a complete re-render of a document that describes almost 600 script commands, some of which are very extensive.

Feature-wise, AsciidocFX is the best. Its real-time preview isn't fast, nor as complete as the others', which limits my use for it. But Brackets and Eclipse were pleasant surprises, each in their own way.

So if you have relatively simple AsciiDoc documents to create, AsciidocFX is probably your best bet. For conversion and/or repair of existing documents (with some complexity), I would say to focus on Brackets and Eclipse. Brackets, if you have grokked enough AsciiDoc syntax and can work without real-time previews. Or Eclipse, if there is a need for a real-time preview that won't slow you down that much.

Oops, forgot about Notepad++. The syntax highlighting works rather well and, as it is the least extensive editor of the bunch, it is pretty fast. But without a preview option, you'd better have a pretty firm grasp of the AsciiDoc syntax and a very clear idea of how you want your new document to be structured. That requires a lot of discipline, which people who code for a living have fewer issues with than other mortals. Its usefulness as an AsciiDoc editor is therefore limited for most.

All of the editors discussed in this particular post can be used as a portable app, if that is a thing for you.

AsciidocFX can be finicky. I have used it on many different computers with lots of different versions of Windows and it never gave any issue, until I tried it at home. There is a continuous error about the JVM not being able to start because of max memory allocation. No matter what change I made in those settings, it just refused to start. Yet IntelliJ and Eclipse are also Java-based and have no issue working on this system. Something I thought worth mentioning for those considering editors.

Well they have LINE, it just hasn't been updated in a couple of years and won't work with 64-bit yet.  I wish that it would be updated...

A real shame that it was never ported to 64-bit. I had pretty good experiences with Portable Ubuntu Remix. But coLinux is only capable of running 32-bit editions of Linux, and (as far as I know) Ubuntu doesn't bring out 32-bit versions of their OS anymore. Although migrating it from the 2008 version of Ubuntu to the most current one would be a chore, I would give that a go.

General Software Discussion / Re: Goodbye, Bitbucket!
« on: August 23, 2019, 10:43 PM »
I did find some projects on SourceForge, I think, that were supposed to enable web access to CVS repositories. Also abandoned, just like the client. Whatever was there, I could never make it work. In the version of Jira (issue tracker) that I have running, there is functionality to hook CVS up to it. That way it should be possible to have, per tracked issue, a list of changed files, the CVS comment, the number of changes inside each file, and links to a web interface that shows the actual files and their changes.

Getting the file list overview was not a problem, and the number of changes and the CVS comment are also correctly displayed, but I never got the web interface, where the actual changes could be seen, to run.

SCCS originated in 1972!?!? And here I was thinking that CVS was old and feeble...

It has been decided though that GitLab (on a dedicated server) must be used as replacement for the CVS repo.

I've been taking a look at this 'Fossil'. Looks interesting enough and likely suffices for a one-man shop or a very small team (caveat: I have been spoiled by Jira). Thanks for that one, though.

Not really simple with regards to capabilities, but simple enough to install quickly on the server that hosts the files: try Nextcloud. User management and (read-only) access to files can be pretty fine-grained. There is a community and an enterprise version. I have run the community version for quite some time now and it doesn't disappoint.

Caveat: I run it on a Linux server with a 12 year old Core Duo (Dual core) processor and only 2 GByte of RAM. Still, it runs just fine for about 15 people.

General Software Discussion / Re: Goodbye, Bitbucket!
« on: August 22, 2019, 01:40 AM »
I keep trying to learn how to use git because it seems like all service providers only support git and I want to take advantage of some of the cool features these services provide. But for one reason or another I get frustrated with git and give up and go back to mercurial.

(G)It doesn't get simpler than with the NitroGit client (not free), although there doesn't seem to be a time limit to their evaluation version. It has a different look than any other application and it would be understandable if that isn't your "thing". But it is simple to use.

I never got into SVN after several attempts, and Mercurial I have never tried. There wasn't much time for it, as I was bound to CVS (yes, that old beast). But there is talk of converting/upgrading the very active but also almost 20-year-old CVS repo to Git/GitLab. That will be fun...

Ah well, Git cannot be worse than CVS. The last freely available version of that software is from 2005, the WinCVS client is also from that year and has been stagnant for almost 14 years. Mercurial came and practically went in that time... That puts things in perspective  :P

When I was still transferring my CDs to audio files, I used CDex (v1.51 if I remember correctly, it has been quite a while). As far as I know it supports FLAC (and WAV and MP3) and it's freeware.

I can already rip DVDs easily with DVD Decrypter and Handbreak. But they don't allow the option to selectively choose which chapters to include and which to exclude. To be more precise, Handbreak allows you to select a range of chapters to include, but it won't allow you to omit any chapters within that range.

So when I saw that VideoProc had DVD ripping capabilities, and that it allowed me to select arbitrary chapters to include in the ripped video, I thought I could clean up a couple videos I have.

I'm pretty sure it wasn't a problem with copy protection because in actuality the "DVDs" I ripped were actually ISO files that I had previously ripped with DVD Decrypter which I think doesn't include the copy protection in the generated ISO file. I suppose I could try it again using the actual DVDs, but I don't expect any better results, and in fact as you mentioned the copy protection may interfere with it.

Try MakeMKV (which is beta software, but has been stable for the years I have been using it). As far as I know you can use its DVD ripping functionality for free; you have to fork over money for ripping Blu-ray discs. If you are familiar with HandBrake, you really have to try VidCoder. That particular software uses the HandBrake code, but adds a lot of functionality with a much easier (to me) interface to do your "thing".
For cutting up (and pasting back together) videos you should get: MKVToolNix.

The problem you describe sounds to me like your software is showing the content of the last buffered frame before the removal of the offending part. DVDs still use tricks to mess up playback on computers, tricks that do not affect DVD players connected directly to your TV. Blu-ray players are more sensitive to such tricks, but those come with internet connections and enough hardware/software to update themselves with new DRM scenarios, depending on the content of the discs being played. So these shouldn't suffer too many issues either.

So, you had better cut the offending content out of the existing videos. MKVToolNix may not be the easiest software to work with, but it's powerful, hence it will get the job done.
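With MKVToolNix's command-line tool mkvmerge, keeping only the parts you want can be sketched like this (file names and timestamps are just examples):

```shell
# Keep 0:00-12:30 and 14:00-55:00 of the source; the '+' prefix appends
# the second range onto the first output file instead of starting a new one.
mkvmerge -o cleaned.mkv \
  --split parts:00:00:00-00:12:30,+00:14:00-00:55:00 \
  ripped.mkv
```

Note that mkvmerge cuts near keyframes without re-encoding, so the actual cut points may land slightly off from the timestamps you specify.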

Living Room / Re: Buying a 2TB SSD to replace my 2TB HDD
« on: August 17, 2019, 10:35 AM »
Reducing the write cycles on a SSD is still a wise thing to do.

In my system there is a boot SSD of 120 GByte, which is split into 3 partitions: 1 is the tiny one Windows itself creates for its boot procedure; 2 is the C:\ partition, which is 30 GByte (between 7 and GByte is free), dedicated to Windows itself; 3 is the D:\ partition that holds my portable apps and program files. 4 is a 10 GByte section of empty, unpartitioned space, left for the drive's own error management.

Then there is a 3 TByte SATA drive for my data, which also holds a partition containing a page file with a static size of ((2 x amount of RAM in the PC) + 20%), plus a set of portable drives for backup purposes.
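The page-file formula above, worked out in a quick sketch (the 24 GByte figure is just the amount of RAM in my own machine):

```python
def page_file_size_gb(ram_gb: float) -> float:
    """Static page file size: (2 x RAM) plus 20%, in GByte."""
    return round((2 * ram_gb) * 1.2, 1)

# A PC with 24 GByte of RAM gets a fixed page file of 57.6 GByte.
print(page_file_size_gb(24))
```

Setting the minimum and maximum page file size to the same value keeps Windows from constantly resizing (and thereby fragmenting) the page file.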

Some explaining:
NTFS is the most common file system on Windows. It performs best when its partitions have between 10% and 20% of free space, though more free space is preferred. Making partitions helps you to achieve that goal.

Also, NTFS is a file system that makes a mess of how it stores files on disk. That is by design, which is why it needs a relatively big chunk of free space and file management (defragging) to keep up performance. This makes it faster than the standard ext3 or ext4 file systems on Linux, for example, but only when there is enough free space available. When there is insufficient free space, performance quickly drops below that of the Linux file systems. Those Linux file systems fragment much more slowly and suffer far fewer performance problems when drives are filled to the brim. That is also by design.

Severely simplified: NTFS packs files very close together, which results in less 'travel' of the hard disk heads. Which is in essence a good idea, but only with static files. When files grow or shrink, this dense packing results in files being chopped up, making the hard disk head travel more, instead of less. The Linux file systems spread out files, which initially makes the hard disk head travel more. But files don't fragment that quickly this way, because there is room for them to grow or shrink.
More modern file systems follow the design ideas of the Linux file systems more closely, as these give you stable performance. Their extra performance comes from better interaction with the operating system and smarter ways of handling the actual reading/writing of files.

Yes, partitions create artificial limits on your drive(s), which may cause you problems along the road if you didn't properly set the partition sizes for the tasks you have intended for the computer. But partitions make the background maintenance NTFS needs much easier on your system. It saves wear and tear on spinning drives and keeps things organized.

There is also another consideration, especially when virtual machines are being used. Keeping Windows separated from (portable) applications and user data gives you a very clear advantage. Most virtual machine products let you assign partitions from the host to each virtual machine you create. Installing Windows, and especially configuring such a VM, can be quite time-consuming. But with portable apps, you can cut out the time spent on configuring. Assign the correct VM drive letter (in my case D:\) to the host partition, and every shortcut works just as well on the host as in the VM. You can also continue your tasks in the VM right where you left off on the host. It also saves a ton of storage space on the host, which considerably reduces the time you need for creating backups, as these can be a lot smaller.

Not making partitions makes life easier during setup of your computer; afterwards it adds (unexpected) complications. And you spend much less time setting up systems than using them, just saying. Yes, Microsoft is hell-bent on dumping everything in C:\ , but that sense of initial 'simplicity' is a false one, and it creates lots of opportunities for MS and 3rd parties to sell you software to help out with complications they created themselves.


Living Room / Re: Buying a 2TB SSD to replace my 2TB HDD
« on: August 16, 2019, 07:37 PM »
And my questions for y'all are these:
  • When SSDs first became popular, everyone warned how heavy usage would run them down, but that's a long time ago now.  I'm sure they are more viable now.
  • What SSD should I buy?

  • Yes and no. The first models were indeed troublesome. To fix those problems, manufacturers started to use better quality components, and that worked out well for the customers. But better quality components mean higher costs, so nowadays manufacturers use different techniques to get away with more or less the same service life, but at a much lower price for them. In a way that is good for the consumer as well, as prices dropped significantly. But you should take into consideration how you are planning to use your SSD(s). For some use cases, it is better not to get SSD(s) that use MLC: they won't perform as well and have a shorter service life. The controllers on those SSD(s) are decent enough and do make the best use of the capabilities of those drives.
  • You can hardly go wrong with Samsung drives, but those have a price tag that your budget might not agree with. I have a few SanDisk SSDs deployed here and I must say they perform well enough for their price; they are way cheaper than Samsung. I have read positive stories about Kingston SSD drives too. Then again, you are going to trust your data to those drives, so the more expensive Samsung drives are likely your best bet.
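On the wear-and-tear point in the first bullet: a drive's endurance rating (TBW, terabytes written) divided by your daily write volume gives a ballpark lifespan. The figures below are made up purely for illustration:

```python
def years_of_endurance(tbw: float, gb_per_day: float) -> float:
    """Very rough lifespan estimate: rated endurance (TBW, terabytes
    written) divided by the daily write volume."""
    days = (tbw * 1000) / gb_per_day   # 1 TB taken as 1000 GB here
    return days / 365

# e.g. a drive rated for 600 TBW under a heavy 50 GB/day workload:
print(round(years_of_endurance(600, 50), 1))   # 32.9 (years)
```

Even with pessimistic numbers, write endurance is rarely the limiting factor for a desktop user these days.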

Same here. 3-year-old Win 10 installation with Office 2010 on it.

But in my case, I go out of my way to use portable applications. If a portable version isn't available, I'll try to make the application 'portable' myself, and if that doesn't work, the application gets removed. Having said that, it does require me to install stuff on a semi-regular basis, so I thought your script would be useful for finding out how much crap remains even after software has been removed (using Revo Uninstaller). But that is the extent of the cleaning I have done on this particular computer.

If it is of any help, your script returned 2 orphans, one of them was: ccc-utility64. The other: Microsoft SQL Server Data-Tier Application Framework (x86).
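I don't know how your script works internally, but I imagine the basic orphan check looks something like this sketch: compare the uninstall entries the system still knows about against the program folders actually present. The "7-Zip" entry below is a made-up example next to the two real orphans from my result:

```python
def find_orphans(uninstall_entries: set[str], program_folders: set[str]) -> set[str]:
    """Uninstall entries whose program folder is gone: likely leftovers
    that the uninstaller (or Revo) failed to clean up."""
    return uninstall_entries - program_folders

entries = {"ccc-utility64",
           "Microsoft SQL Server Data-Tier Application Framework (x86)",
           "7-Zip"}
folders = {"7-Zip"}
print(sorted(find_orphans(entries, folders)))   # the two orphans
```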

Living Room / Re: Windows 7 always slow after idle
« on: August 15, 2019, 04:16 AM »
You know, it is pretty easy to make Windows 10 behave like Windows 7. This article (How-To Geek) gets you most of the way there. With this tool (NeoWin), you can mimic Windows 7 on Windows 10 even more. And this article (AskVG) discusses how you can enable the Windows 7 glass effect in Windows 10.

With O&O ShutUp10 you can disable a lot more MS telemetry than the options provided by MS themselves. And with this tool (intowindows) you could swap the icons that come with Windows 10 for their Windows 7 counterparts.

That should cover most, but likely not all, reasons to abandon the Windows 7 bandwagon and jump onto the Windows 10 one. Once you have done all these steps, the experience with your motherboard's BIOS will be better too, and you are more likely to get all the intended performance out of it, because you are running the Windows version the manufacturer designed it for.

Living Room / Re: Windows 7 always slow after idle
« on: August 14, 2019, 10:11 PM »
I can already see this from a mobo vendor.. oh hey.. urgent, update your BIOS to mitigate Intel's latest CPU vulnerabilities.. oh btw, you will lose all your RAID configs.  :o

If you are afraid of that, don't use the built-in RAID options of your motherboard. Either get a dedicated 3rd party RAID card or go the software RAID route.

In case you go for dedicated hardware, buy two of the (exact) same cards. That will save you a lot of money when you want to retrieve data from your RAID setup when (not if) it fails. Data retrieval from RAID drives is difficult, time-consuming and therefore really expensive. You think that extra card is expensive? You'll easily pay 2 or 3 times that price before a data-retrieval company even wants to look at the RAID mess you've got. An advantage of dedicated RAID hardware is speed. You think you have good speed with the built-in RAID hardware? Dedicated hardware trumps it. Easily.
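For what it's worth, the reason RAID recovery is possible at all (and why RAID 5 in particular survives a single dead drive) is plain XOR parity. A toy sketch:

```python
def xor_blocks(*blocks: bytes) -> bytes:
    """XOR equal-length blocks together: the heart of RAID 5 parity."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

d1, d2, d3 = b"AAAA", b"BBBB", b"CCCC"   # data blocks on three drives
parity = xor_blocks(d1, d2, d3)          # stored on the parity drive

# If the second drive dies, its block falls out of the surviving XOR:
assert xor_blocks(d1, d3, parity) == d2
```

The hard (and expensive) part of professional recovery is not this math; it is working out the controller's proprietary on-disk layout (stripe size, drive order, parity rotation), which is exactly why having an identical spare card helps so much.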

Software RAID is also quite fast, only slightly slower than the built-in RAID hardware. It is more stable, usually relatively easy to repair/reconstruct, and won't be affected by a BIOS update. I have been running one for over 14 years already (on a Linux machine) and it has not failed on me. At one point the unstable grid fried both the UPS and the original motherboard. Maybe I should say it baked the UPS, as there was a small, but nonetheless open, fire involved.

It was running on an Intel-based mobo and there was only a spare AMD-based mobo available. I swapped out the motherboards and started the machine back up. It showed warning messages about the different hardware it detected, then the Linux software automatically downloaded whatever drivers it needed, and one reboot later the whole server, including the software RAID, was spinning like nothing had happened. That was my experience with Ubuntu Server LTS.

Before I changed to that distro, the company decided it should run on CentOS, as that was the distro the other developers were developing on. The hours of rebuilding the RAID that were lost after grid "hiccups" with that distro... amazing, in a very bad way.

Of course, my experiences are anecdotal, but in this place no more CentOS. Ever.

On a side note: I had to do the same trick on the mail server I run on-premise, only now from AMD to Intel; that also worked with Ubuntu Server LTS. Oh, one thing worth mentioning: none of my Linux servers have a GUI installed. I assume that helped a lot when swapping motherboards/processors.

Anyway, the software RAID does improve the speed and reliability of the data you store on it. And in my experience it is way more stable than any hardware-based RAID solution.

Although, nowadays I wouldn't even consider RAID. File systems such as Btrfs, ZFS and the like have practically all the good qualities of RAID already built into them. That makes RAID redundant (pun intended). Just get fast drives.
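A toy illustration of one of those built-in qualities: ZFS and Btrfs checksum every block, so silent corruption ("bit rot") is detected on read, something neither NTFS nor a dumb RAID mirror does. The sketch below uses SHA-256 just for simplicity; the real file systems use their own per-block checksums:

```python
import hashlib

def write_block(data: bytes) -> tuple[bytes, str]:
    """Store a block together with a checksum of its contents."""
    return data, hashlib.sha256(data).hexdigest()

def read_block(data: bytes, checksum: str) -> bytes:
    """Detect silent corruption by re-checking the block on every read."""
    if hashlib.sha256(data).hexdigest() != checksum:
        raise IOError("checksum mismatch: block is corrupt")
    return data

block, ck = write_block(b"important data")
read_block(block, ck)                    # fine
# read_block(block[:-1] + b"!", ck)      # would raise IOError
```

With redundancy (a mirror or parity) on top, the file system can then repair the corrupt copy from a good one automatically.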

Linux and BSD operating systems let you install these newer file systems, if they aren't already included in the OS. Windows, meanwhile, is still stuck with NTFS. Sure, you have a choice between NTFS and FAT32: two aging systems. Yes, I am aware that NTFS has gained a lot of new features over the years, and it is reliable within reason, but the possibility to add different file systems would have been very welcome by now. When Windows Vista was being developed ("Longhorn", anyone?), Microsoft happily announced they were developing a new file system that would do most of what Btrfs/ZFS can do. Yet MS couldn't kill that attempt at progress quickly enough. Instead, only a small subset of those features found their way into NTFS. Better the devil you know, I guess.


Living Room / Re: Windows 7 always slow after idle
« on: August 13, 2019, 12:44 AM »
Is the BIOS setup to expect your boot drive to be in a specific (SATA) port on your motherboard, while it is physically connected to a different (SATA) port?

If so, turn your PC completely off and either move the hard disk cable of your boot drive to the configured port, or adjust the configuration to expect the drive in the port it is currently connected to. Whichever you feel is easiest.

Some motherboards come with 2 different SATA chipsets. It might be the case that your hard disk is connected to the one that the system is not allowed or able to boot from after the BIOS update. It always helps to read the change log of the BIOS update you flash to your motherboard. It may well be that lots of people encountered problems when booting with older BIOS versions, and that the manufacturer chose to disable booting from the problematic SATA chipset.
