
Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Shades

7-Zip vs WinRAR vs WinZip
Compression Rate
7-Zip vs WinRAR vs WinZip – Compression Rate and File Size
As for the compression rate and the output file size, 7-Zip, WinRAR and WinZip don’t differ too much. But the output format you choose can make a difference.

For instance, you can choose the .zipx format rather than the .zip format when compressing files with WinZip; ZIPX has a higher compression ratio than ZIP. Likewise, if you choose the 7Z format instead of ZIP when compressing with 7-Zip, 7Z achieves a much higher compression ratio than ZIP.
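The gap behind that claim is easy to demonstrate with Python's standard library: zlib implements DEFLATE (the classic .zip method) and the lzma module implements the algorithm behind 7z. A minimal sketch; the sample data and resulting sizes are illustrative, not taken from the article:

```python
import lzma
import zlib

# Semi-repetitive data, similar to log/dump files: it compresses well,
# but has enough variation that the compressor's modeling matters.
data = b"".join(
    b"2021-05-09 09:%02d:%02d INFO request id=%d status=200\n"
    % (i % 60, i % 60, i)
    for i in range(20000)
)

deflated = zlib.compress(data, 9)          # DEFLATE, as used by classic .zip
lzma_out = lzma.compress(data, preset=9)   # LZMA, as used by .7z

print(len(data), len(deflated), len(lzma_out))
```

On data like this, the LZMA output comes out noticeably smaller than the DEFLATE output, which is the whole point of picking 7Z over ZIP.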

Based on one test compressing 1.5 GB of video files, 7-Zip delivers the highest compression rate, WinRAR comes second, and WinZip provides a compression ratio that is about 6% lower than 7-Zip's. But if you choose the advanced .zipx format in WinZip, its compression ratio is almost the same as 7-Zip's.

Just mentioned it to clear things up, or was posting a result of "7-Zip is 75% better than WinZip" a joke that belongs in here?

Not a joke at all. And I wouldn't have mentioned those results if I weren't able to reproduce them myself, which I do about 3 to 4 times a week. If the person from the article you quoted couldn't reach those results, he/she didn't play enough with the available settings, in any of the archivers. By all means, play around with dictionary sizes and word sizes inside these archivers, and a whole new world will open up to you.

And why did that person choose video files for compression? If there is one type of file that always delivers negligible compression results, it is video. The same goes for MP3 files. If you wish to shrink video files significantly, try re-encoding them with a different codec, such as x265. Sure, the re-encoded file may no longer play directly on your Smart TV (if it only supports the x264/H.264 codec), but it will play just fine with VLC/PotPlayer/Media Player Classic/GOMPlayer/etc. on your desktop/laptop, and you will hardly see any difference between a 600 MByte video file (x265) and its 2.4 GByte original (x264). No archiver will come close to those results.
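Why archivers fail on video and MP3: those files are already compressed, and compressing high-entropy data again gains nothing. A quick stdlib sketch, with random bytes standing in for already-encoded media:

```python
import lzma
import os

# Random bytes are effectively incompressible, just like data that has
# already been through a video or audio codec.
blob = os.urandom(1_000_000)
packed = lzma.compress(blob, preset=9)

# The "compressed" output is not smaller; the container even adds overhead.
print(len(blob), len(packed))
```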

Files such as database dump files and log files (kept for auditing/troubleshooting purposes) are very compressible. I encounter database dump files of 20+ GByte each on a daily basis; compressing these with 7-Zip yields an archive of around 400 MByte for each of them, on rare occasions even less. Neither zip nor rar is able to come close.

It's no skin off my back if you choose to disregard this, though. After all, if you are happy with your solution, more power to you.

If you are under the impression that WinRAR has great compression, you really haven't tried other compressors. WinRAR's saving grace is when you distribute multi-part archives and need to create par files to reconstruct broken parts of such an archive. That is more or less the only thing WinRAR does right. But with the internet as it is, how often is that functionality necessary? In the days of 33k and 56k modems that repair functionality was handy; nowadays it takes less time to download an archive, check it against a hash and re-download it in case the hash check fails. Repairing multi-part rar archives didn't always go OK either.
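The download-hash-redownload workflow boils down to comparing a published checksum against one you compute locally. A minimal sketch using Python's hashlib, streamed so multi-GByte archives don't need to fit in RAM (the file name here is made up for the demo):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in 1 MByte chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()

# Demo: a tiny stand-in "download" with a well-known SHA-256.
with open("archive.7z", "wb") as f:
    f.write(b"hello")

expected = "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"
print(sha256_of("archive.7z") == expected)  # True: archive is intact
```

If the comparison fails, you throw the download away and fetch it again, which on a modern connection is quicker than any par-based repair.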

Compared with zip archives, rar archives can be significantly smaller. That is what I was always told, until I actually started testing zip, rar and 7zip. Seriously, rar could compress to 50% of what zip could do. Which is good, don't get me wrong. But 7zip reduced file sizes to 25% of the rar results. When you have data files several gigs in size, you really see how good 7zip is. And no, it hardly takes longer to create or extract these archives either. You are more bound by the write speed of the drive you extract the archive to than by 7zip itself being slow.

There are even more specific compressors out there which do an even better job than 7zip can, but creation and extraction of archives made with these compressors does take significantly longer.

For my use case way back when, 7zip saved me hours per week. I made lots of scripts that do the archiving and extracting automatically, and it still saves me almost 30 minutes each time I need to create a build, pull it to my side of the ocean, extract it and push it through a significant battery of regression tests, all without manual interaction. For a piece of freeware, 7zip is practically like gold. As an example: each build is about 2 GByte in size; 7zip creates a 200 to 250 MByte archive from such a build, while with WinRAR I'm already happy if the archive is 600 MByte. The extra 5 to 10 seconds it takes to create and extract these 7zip archives vanish like a snowball in hell when the transfer (and verification) time is 10 minutes for the 7zip archive and more than 40 minutes for the rar archive.
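That kind of automated create/hash/transfer/extract pipeline can be sketched with nothing but Python's standard library. This is not the author's actual script, just a minimal stand-in: tarfile's "w:xz" mode uses LZMA, the same algorithm family as 7-Zip's .7z.

```python
import hashlib
import pathlib
import tarfile
import tempfile

def pack(src_dir, archive_path):
    """Archive a build directory with tar+xz (LZMA) and return its SHA-256."""
    with tarfile.open(archive_path, "w:xz") as tar:
        tar.add(src_dir, arcname=".")
    return hashlib.sha256(pathlib.Path(archive_path).read_bytes()).hexdigest()

def verify_and_extract(archive_path, expected_sha, dest_dir):
    """Refuse to extract if the archive doesn't match the published hash."""
    actual = hashlib.sha256(pathlib.Path(archive_path).read_bytes()).hexdigest()
    if actual != expected_sha:
        raise ValueError("hash mismatch: re-download the archive")
    with tarfile.open(archive_path, "r:xz") as tar:
        tar.extractall(dest_dir)

# Demo with a throwaway "build" directory.
src = tempfile.mkdtemp()
pathlib.Path(src, "build.log").write_text("ok\n" * 1000)
digest = pack(src, "build.tar.xz")

out = tempfile.mkdtemp()
verify_and_extract("build.tar.xz", digest, out)
```

In the real workflow the hash travels alongside the archive, and the verify/extract half runs on the receiving side of the ocean.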

Archives are part of a strategy/procedure you employ for whatever reason you may have, and there are gains to be had when taking a good look at the whole strategy/procedure. Zip is the most convenient archiver, but that one won't win you the race. Neither does rar, but that one shows a decent effort. 7zip really is much better, especially if you have it dialed in.

The above has nothing to do with silly humor, so here is my contribution:

Aren't you glad there is no Windows Phone anymore?  Mine came with off-line maps already built in. And it could use cellular masts to do a "poor man's GPS" by triangulating position. Used it in this city myself (on foot), some 4 years ago, maybe 5 by now.

Here is a nice overview of available Git clients and the operating systems they are available for.

After a period Fork won't start anymore.

The Fork Git client is 49 USD if I remember correctly. My boss is the user who deals with both CVS (his favorite versioning system) and Git, the favorite of the main architect in the company receiving the code we produce. And he found this to be a very reasonable price for Fork. To date it is also a one-time fee. If this changes in the future? My crystal ball is too fogged up to see, I'm afraid.

Have been using GitExtensions (free/open source) and Fork (paid) successfully with a remotely hosted GitLab (enterprise version) for the last 2 years. There is a need to connect with a VPN client to their VPN server first (2FA only) before I can access their GitLab instance. Hence I expect not many of the safety features in GitLab are activated; the VPN barrier is the security.

The above might be an idea for you too, with regards to securing your GitLab instance.

Back to clients: GitExtensions is pretty stable and pretty extensive. However, it misses one very important feature. If you do a diff between files, it won't show them side by side, just one view where all changes are shown where they happen. Whoever thought that is sufficient, or even efficient, has never worked with side-by-side diff views. But the creators behind GitExtensions are adamant that a side-by-side diff is not necessary. Their answer: you can link your favorite diff software to do side-by-side diffs, and GitExtensions will open it for every file that has a change in it. All nice and dandy if you have small sets of changed files, but that is not the case where I work. Use that method with a few hundred files and you'll know what a hassle it is.

Fork does side-by-side diffs natively. It also used to be free, but 6 months or so ago they changed their policy and my boss paid up immediately. I have been playing a little bit with Guitar, a free Git client that does side-by-side diffs, but there do not seem to be editing capabilities within that view. It has other things going for it, though more time is needed to play with it.

A bit of a rant about Atlassian:
Atlassian is a company with an attitude. We use their Jira product. For legal reasons we can only use it as an on-premise product. While they first sent out a communication that the on-premise Jira product would still be supported in the future, their most recent communication now simply states that in 2024 all support drops and only the cloud version of Jira will remain. Although the internet has improved a lot in Paraguay, it still isn't as stable as it should be. So, even if there were no legal constraints (with financial consequences) to consider, cloud solutions are not as ideal a solution to me as an on-premise server is.

But this forced move by Atlassian creates more problems than it is worth. So, their stance on other products like SourceTree can also change on a dime. While it is their right to do so, for my particular use-case they screw me over. Ah well, at least they give enough warning, so there is time to look for alternatives.

Overview of command-line audio players:
MPXplay (this one has a CUI, not a GUI)

You can create a WAV file with an 18Hz tone yourself with Audacity (open source). 18Hz was the frequency where I stopped hearing it; 25Hz was still as clear as day. Which is weird, as I am way closer to 50 than I like to admit and I have not been kind to my ears all those years. Or you could use the 10 seconds of glorious 18Hz WAV file I attached to this post.
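If you'd rather script the tone than click through Audacity, Python's built-in wave module can synthesize it. A small sketch (the file name mirrors the attachment's naming, but the generated file is of course not the attached one):

```python
import math
import struct
import wave

RATE = 44100                   # samples per second
FREQ = 18                      # Hz, the tone we want
SECONDS = 10
AMPLITUDE = int(0.8 * 32767)   # 80% of 16-bit full scale

with wave.open("18Hz_44100Hz_16bit_10sec.wav", "wb") as w:
    w.setnchannels(1)   # mono
    w.setsampwidth(2)   # 16-bit samples
    w.setframerate(RATE)
    frames = bytearray()
    for n in range(RATE * SECONDS):
        sample = int(AMPLITUDE * math.sin(2 * math.pi * FREQ * n / RATE))
        frames += struct.pack("<h", sample)  # little-endian signed 16-bit
    w.writeframes(frames)
```

Swap FREQ for 25 to generate the 25Hz variant instead.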

Personally I prefer portable applications, so for my situation an example batch file would look like:
Code: Batch
  D:\PortableApps\MultiMedia\Audio\MPXplay\mpxp_mmc_x64.exe -pre D:\PortableApps\MultiMedia\Audio\MPXplay\18Hz_44100Hz_16bit_10sec.wav

Save this script as: Play_18Hz_continuously.bat     or    Play_18Hz_continuously.cmd

This would play the 18Hz file in a loop. Whenever you turn on your computer, you start this little script too and you'll have what you need.

And if you want to make sure it starts when your computer boots, you could also use the Windows Task Scheduler to configure a task that executes this script when your system boots. Then you don't even have to remember to activate it at all. 

Then I remembered you mentioned playing an inaudible, low frequency sound to help with your BT issues and so here I am, asking about it. :D

If, for example, you don't use Windows Media Player for anything, you could set it to play the same (tiny) audio file in a loop. Getting a 25Hz WAV file will be the hardest part to do in this workaround.

But if you do use Windows Media Player (?!?!!), you could replace it with a simple portable audio player with a tiny GUI that you can easily hide from view, or one with no GUI at all. If this player doesn't have a 'loop' mode, you could use the Task Scheduler built into Windows to play the sound every 10 seconds, for example.

A tiny batch or powershell script can automate the few steps this workaround takes to play the sound, so it works with a simple double-click. Quick and easy.

Non-Windows Software / Re: file backup tools for linux
« on: April 08, 2021, 09:52 AM »
Bacula and BareOS are the first that come to mind.

BareOS is a fork of Bacula, so they share their versatility in backup methods and storage media. They also share a very unfriendly way of configuring devices, backup scenarios, compression/de-duplication, etc. Everything can be fully automated, and you can add a web interface to your setup, making it easy to get overviews of what has been backed up and when. Once you get your head around their way of doing things, it isn't that hard; it just takes a while.

Myself, I have a Bacula setup and it has proven to be a rock-solid solution for 10+ years.

Some key mapper tools show which application laid claim to any given key combination. Find out which application did so and check whether there is an option to alter this key combination inside the application itself.
If that is not an option, try to alter key combinations inside the key mapper of your choice (remove the combination first, then recreate it and assign it to the appropriate application).

How is your 'Trash can' configured? You might throw files away, after which Windows should report that more space has been freed up. But that doesn't always work as smoothly as you would expect (by now). As suggested earlier, using the disk cleanup tools built into Windows fixes this.

Also, over time your Windows installation gets bogged down with applied patches and updates. Windows keeps a spare copy of each, so you can roll your system back/forward. There are ways to get rid of those files, but do this with caution. Depending on the number of patches/updates you have collected over time, you could reclaim several GBytes on your C:\ partition. See this link for more info.

How is your 'pagefile.sys' configured? Depending on how that is configured, you could reclaim several GBytes on your C:\ partition. By default this is set to let the operating system handle it automatically. Over time the size of this file could have grown to a size that is unnecessary. You can disable the automatic management and set the minimum and maximum size of this file to the same value. I always use a size of 2 times the available RAM when there is 4 GByte of RAM or less. If there is more than 4 GByte of RAM in the laptop, I set the min./max. values to the same amount as the available RAM.
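My sizing rule above, written out as a tiny helper (a sketch of the rule of thumb, not anything Windows itself computes):

```python
def pagefile_size_gb(ram_gb):
    """Rule of thumb: 2x RAM at 4 GByte or less, 1x RAM above that."""
    return 2 * ram_gb if ram_gb <= 4 else ram_gb

for ram in (2, 4, 8, 16):
    print(f"{ram} GByte RAM -> fixed pagefile of {pagefile_size_gb(ram)} GByte")
```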

Windows 7 defrags your HDD on a weekly basis (by default). Once in a while you could use a different tool to defrag your drive. Defraggler (freeware) can also defrag the empty space on your drive; often this results in a little bit more free space.

Do you use the 'Restore points' feature of Windows? By default this is enabled. Depending on how many of these restore points have been generated over time, you could reclaim a GByte or two on your C:\ partition. If you feel competent enough, you could disable the feature and remove all remaining restore points. If not, you can reduce the number of restore points manually or configure the feature to do this automatically for you.

There are usually two folders in the root of your C:\ partition, called 'Temp' and 'TMP'. Open these folders, select everything and delete the files. Temporary files that are still in use will not be deleted, but most of the collected cruft will be. Don't expect to reclaim much free space with this, though. You could do this on a regular basis, but not too regularly, as it can negatively affect your whole computing experience. Myself, I don't care about the results of emptying these folders, but people consider me weird over here.
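The "delete what you can, skip what's in use" sweep can be sketched in a few lines of Python. This is a demo against a scratch folder, not a tool you should point at the real Temp folder unreviewed:

```python
import pathlib
import tempfile

def sweep(folder):
    """Delete the files we can; files that are locked/in use are skipped."""
    removed = 0
    for entry in pathlib.Path(folder).iterdir():
        try:
            if entry.is_file():
                entry.unlink()
                removed += 1
        except OSError:
            pass  # file in use or protected: leave it alone
    return removed

# Demo: populate a throwaway folder with some "cruft" and sweep it.
scratch = tempfile.mkdtemp()
for i in range(5):
    pathlib.Path(scratch, f"cruft{i}.tmp").write_text("x")
print(sweep(scratch))
```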

Those are the immediate tricks that come to mind for reclaiming space on your drive.

More clarification about NFTs.

Taken from Slashdot:

When you buy an NFT for potentially as much as an actual house, in most cases you're not purchasing an artwork or even an image file. Instead, you are buying a little bit of code that references a piece of media located somewhere else on the internet. This is where the problems begin. Ed Clements is a community manager for OpenSea who fields these kinds of problems daily. In an interview, he explained that digital artworks themselves are not immutably registered "on the blockchain" when a purchase is made. When you buy an artwork, rather, you're "minting" a new cryptographic signature that, when decoded, points to an image hosted elsewhere. This could be a regular website, or it might be the InterPlanetary File System, a large peer-to-peer file storage system.

Clements distinguished between the NFT artwork (the image) and the NFT, which is the little cryptographic signature that actually gets logged. "I use the analogy of OpenSea and similar platforms acting like windows into a gallery where your NFT is hanging," he said. "The platform can close the window whenever they want, but the NFT still exists and it is up to each platform to decide whether or not they want to close their window." [...] "Closing the window" on an NFT isn't difficult. NFTs are rendered visually only on the front-end of a given marketplace, where you see all the images on offer. All the front-end code does is sift through the alphanumeric soup on the blockchain to produce a URL that links to where the image is hosted, or less commonly metadata which describes the image. According to Clements: "the code that finds the information on the blockchain and displays the images and information is simply told, 'don't display this one.'"

An important point to reiterate is that while NFT artworks can be taken down, the NFTs themselves live inside Ethereum. This means that the NFT marketplaces can only interact with and interpret that data, but cannot edit or remove it. As long as the linked image hasn't been removed from its source, an NFT bought on OpenSea could still be viewed on Rarible, SuperRare, or whatever -- they are all just interfaces to the ledger. The kind of suppression detailed by Clements is likely the explanation for many cases of "missing" NFTs, such as one case documented on Reddit when user "elm099" complained that an NFT called "Big Boy Pants" had disappeared from his wallet. In this case, the user could see the NFT transaction logged on the blockchain, but couldn't find the image itself. In the case that an NFT artwork was actually removed at the source, rather than suppressed by a marketplace, then it would not display no matter which website you used. If you saved the image to your phone before it was removed, you could gaze at it while absorbing the aura of a cryptographic signature displayed on a second screen, but that could lessen the already-tenuous connection between NFT and artwork.
If you're unable to find a record of the token itself on the Ethereum blockchain, it "has to do with even more arcane Ethereum minutiae," writes Ben Munster via Motherboard. He explains: "NFTs are generally represented by a form of token called the ERC-721. It's just as simple to locate this token's whereabouts as ether (Ethereum's in-house currency) and other tokens such as ERC-20s. The NFT marketplace SuperRare, for instance, sends tokens directly to buyers' wallets, where their movements can be tracked rather easily. The token can then generally be found under the ERC-721 tab. OpenSea, however, has been experimenting with a new token variant: the ERC-1155, a 'multitoken' that designates collections of NFTs.

This token standard, novel as it is, isn't yet compatible with Etherscan. That means ERC-1155s saved on Ethereum don't show up, even if we know they are on the blockchain because the payments record is there, and the 'smart contracts' which process the sale are designed to fail instantly if the exchange can't be made. [...]"

In closing, Munster writes: "This is all illustrative of a common problem with Ethereum and cryptocurrencies generally, which despite being immutable and unhackable and abstractly perfect can only be taken advantage of via unreliable third-party applications."

Living Room / Re: Show us a picture of your.. CAR!!!
« on: March 29, 2021, 11:51 PM »

Buy a Rivian. While it doesn't have dramatically more range, it is a pick-up truck, so you'll have room in the back for a generator to charge the battery.  :P

All kidding aside though, after watching the documentary series 'Long Way Up' I was impressed with the Rivian truck. The documentary is about 2 old friends and movie actors who travel from the southernmost point of South America to Los Angeles, each on a highly experimental electric Harley-Davidson motorcycle. They also use 2 Rivian electric pick-up trucks as support vehicles. The Rivian trucks did not break down during the whole trip, and only 1 of the electric prototype bikes needed repairs, upon arriving in Central America.

In the beginning the actors and support teams really had to get their minds around how to use electric cars and bikes. That took quite some doing. And as the southernmost point of South America is not that far from Antarctica, the temperature spoiled the fun on more than one occasion. They also traveled through Patagonia (Argentina), which is huge and very sparsely populated, and electricity is not a given there.

Once they started to travel through slightly more densely populated areas, they had become accustomed to the electric vehicles and you start to hear more and more praise from the whole team. The whole trip was about 23,000 miles, and they had to drive through very punishing terrain, sometimes at really high altitudes too.

What I saw is that the Rivian pick-up truck has some really smart storage spaces, and it can seat 6 adults easily. It appeared to me that those trucks have as much usability as a mini-van as they do as a pick-up truck. In the documentary, the Rivian factory also enabled rolling charging: if you are out of energy, the truck can be towed for a few miles, during which it charges its batteries. Not sure if that functionality will be available in their commercial models.

Living Room / Re: Movies you've seen lately
« on: March 20, 2021, 11:43 PM »
Overlord is/was indeed a good anime-style story.

Attack on Titan is, in my opinion, an even better story.

When I was younger I watched a lot more anime/manga. Nowadays I hardly watch any, as there is too much drab content. However, Attack on Titan and Overlord are modern examples of two good stories. Don't diss these too quickly just because they are anime. With Attack on Titan it really is better to start watching with as blank a slate as possible.

Attack on Titan - trailer:

Season 1 is OK; Seasons 2 and 3 are definitely better than the first. The 4th and last season is still being broadcast on a weekly basis.

The wallpaper download sites I frequent do indeed use the current resolution as set in Windows as the default resolution for the image download. However, they do not complain/balk about me selecting a different resolution for the download.

Is this such a problem? After logging in, most of those websites have a user settings screen where you can set your default resolution. 

To put your mind even more at ease, download and run the following software:
CureIt! from Dr.Web is free to download and run (for private use). On a reasonable system a full checkup lasts about 15 minutes (1 TByte spinning rust disk, about 50% full). Unless it finds malware on the system, in which case it will take longer, depending on how much malware is actually on the system and whether you want the problematic files deleted, moved or cured.

Not a small download (200+ MByte), and nowadays you need to give your email address, so have a disposable one ready. However, it really is very helpful when you are in need of finding/fixing malware. Usually the downloaded software works for a few days and then it tells you it is out of date.

Instead of downloading new signature files, you will need to download the whole thing again. It is also given a random filename on each download. There is more than enough malware/adware that is aware of the file names of software able to remove malicious software, and/or of software that lets you see what is running in the background (like Process Explorer). The random file name prevents malware/adware from blocking this software.
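The random-filename trick itself is simple: copy the tool under an unpredictable name before running it. A hedged sketch (the helper and file names are made up for illustration):

```python
import pathlib
import secrets
import shutil
import tempfile

def randomized_copy(tool_path, dest_dir):
    """Copy a scanner under an unpredictable name, so malware that
    blocklists well-known tool names can't match it."""
    tool_path = pathlib.Path(tool_path)
    new_name = secrets.token_hex(8) + tool_path.suffix
    target = pathlib.Path(dest_dir, new_name)
    shutil.copy2(tool_path, target)
    return target

# Demo with a stand-in file instead of a real scanner executable.
tmp = tempfile.mkdtemp()
fake_tool = pathlib.Path(tmp, "procexp.exe")
fake_tool.write_bytes(b"MZ")
copy = randomized_copy(fake_tool, tmp)
print(copy.name)
```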

So, if you have software like Process Explorer on your system and you cannot start it, your system has been infected with malware/adware and you are definitely in need of software like AdwCleaner (free for private use), JRT, RKill and CureIt!.

** edit: additions

A good example of applying the 3-2-1 rule when backing up.

For those unfamiliar with that rule:
You should have 3 copies of your data (your production data and 2 backup copies) on two different media (disk and tape) with one copy off-site for disaster recovery.
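The rule is mechanical enough to check in code. A toy validator, with the copy descriptions made up for the example:

```python
def satisfies_3_2_1(copies):
    """copies: list of (medium, offsite) tuples, one per copy of the data.
    3-2-1 rule: 3 copies total, on at least 2 different media,
    with at least 1 copy off-site."""
    media = {medium for medium, _ in copies}
    has_offsite = any(offsite for _, offsite in copies)
    return len(copies) >= 3 and len(media) >= 2 and has_offsite

plan = [
    ("disk", False),  # production data
    ("disk", False),  # local backup on a second disk
    ("tape", True),   # off-site tape copy
]
print(satisfies_3_2_1(plan))  # True
```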

And a reasonably interesting blog post on why this rule sucks...  ;)

Living Room / Re: Gadget WEEKENDS
« on: March 08, 2021, 06:12 PM »
You have 2 RTX3080 video cards in your new PC?  Better put in a 1200 Watt power supply. When both video cards have a 100% load, the 1000 Watt power supply will make your system unstable.

RTX3080 and 3090 cards generate really high power peaks and you will find to your chagrin that the 1000 Watt power supply will "choke" on those peaks. A bit of over-provisioning in power supply capacity is not a luxury with those beasts, unfortunately. A computer with a single RTX 3080/3090 and an 800 Watt (platinum) power supply is cutting it close. 1000 Watts sounds like a lot. And it is. But 128 GByte of RAM consumes quite a lot already. All those drives operate at very high speeds and require quite a lot of power as well. Let alone the CPU and chipsets on the motherboard. You'll see that even those chipsets have (passive) cooling. Why? Because they use a lot of power and generate quite a lot of heat too. Combine all that with those video cards and you might even find that a 1200 Watt power supply isn't up to the task. I kid you not.

Anyway, congrats on an awesome system. Wish I had the money myself, or lived in a country where it would be easy to get gear like that. No, Amazon/eBay etc. are hardly an option here, as Paraguay has borderline criminal surcharges for importing computer gear.

Well, in my case (and if I understood correctly, also Deozaan's): if you can't have the gear that you love, love the gear that you have. My old 2009 clunker has SSDs and 32 GByte (4x8) of RAM. But I found that 16 GByte (2x8) performed better. Not all motherboard and chipset combinations handle dual-channel memory well when all 4 RAM slots are filled; leaving 2 of those empty is actually faster on my machine. Video encoding (x265, 10-bit) of a 45-minute video file took 3 minutes less with 16 GByte than with 32 GByte.

To Deozaan my advice would be to get 2 of the largest size RAM modules your motherboard supports and flip your current RAM module(s) on eBay or something. As your system is of comparable age as mine, the maximum size per RAM module will likely be 8 GByte. So get 2 of those. Kingston and Corsair are decent brands for RAM that work (and keep working) on the specified speeds over their functional life. Also check the maximum speed of the RAM modules your motherboard supports.

Info like this is usually easy to come by on the support pages of the motherboard's manufacturer website.

For a clear (and pretty!) explanation of what all the markings on SD cards actually mean, here is a YouTube video of less than 7 minutes.

General Software Discussion / Re: Driver Backup Utility
« on: March 03, 2021, 08:48 PM »
Thank you. I will check it out.  From a quick glance, it does not look like it has been updated since 2010.  By any chance have you run it on Windows 10 64bit?

Yes, I can personally confirm that it works on Windows 10, Windows Server 2016 and Windows Server 2019 too.

Over here I have a Lenovo Yoga 500-something (it is a 2-in-1 model, just like you have). 6 screws are required to remove the bottom half of the laptop and that is it, full access to RAM, HD. Could hardly be any easier.

However, I do think that when a complete disassembly is required, the hardware is oriented more optimally for getting rid of generated heat when the device is in operation, or at least less constricted by solid parts that block the natural rise of heat. The cooling fan inside your laptop should have to work less hard than with the hardware orientation in my Lenovo.

Replacing the original HDD with a simple SATA SSD is the best thing you can do for your HP device. I did the same with my laptop, and although it isn't a speed demon by any means, it is much more pleasant to work with. However, the Windows 10 migration from Windows 8.1 (the original OS) made the laptop very slow after some 30 minutes of use. With only a browser and 2 or 3 tabs open, it became slow immediately. It also took between 48 and 72 hours before the battery was fully charged, and regardless of this, the laptop screen would dim and get brighter whenever it thought it was charging/on battery. Which was even more irritating than the laptop not "waking up" properly after closing the lid.

Re-installed Windows 10 from scratch (using the method on the Microsoft site for installing Win 10): same problems. Very frustrated, I figured I had nothing to lose by trying Linux (ended up with Pop!_OS from System76).
No more dimming, and the battery charged to 100% in a pretty short period, so the screen no longer dims every few seconds. I can have a browser open with 10+ tabs while listening to internet radio, actively using a VPN for remote working and editing a document in the LibreOffice word processor, all without a hitch for any period of time. The laptop also "wakes up" much more reliably than it ever did under Windows 10.

If you are not squeamish about Linux and you wish to give your HP device a second lease on life, it might be something to consider for you too. In a lot of ways the Gnome interface of Pop!_OS is a drastic improvement on the mess that Windows 10 can be (when you need to configure computer settings).

Sorry for repeating the Linux story about my laptop. Just thought I should mention it as a consideration.

If you want to expand your RAM, the manual states that 8 GByte is the maximum. So if you have 2 RAM slots, you can use 2 x 4 GByte RAM modules to get to 8 GByte. It is possible that the 4 GByte you currently have in that laptop are 2 x 2 GByte modules. In that case you will need to replace both of them. Maybe you are lucky if you can get rid of your old RAM modules, but don't expect to get much for them.

But you might be fortunate and have only 1 RAM slot occupied, with a 4 GByte RAM module in it. Then you could gamble and buy another 4 GByte RAM module. Pay very close attention to the type of RAM you are buying, or you'll end up with the wrong type. RAM modules that use DDR3 technology come in two types (standard DDR3 and low-voltage DDR3L). If you get the wrong type, your laptop won't boot.

Even if you get the correct type, there is still the possibility that the two RAM modules are incompatible in combination with each other while working just fine separately. The best way to go is to get a new set of 2 x 4 GByte RAM modules (same brand/make/model) of the correct type, and Windows 10 will perform quite a lot better. Your old single 4 GByte RAM module is much easier to flip, and for a higher price too.

Still, while you will notice the effect of increased RAM capacity in your laptop, it pales in comparison with exchanging the standard HDD with an SSD. That is an order of magnitude more noticeable. Depending on the storage capacity of the SSD, it is cheaper too. The price difference allows you to buy an empty external HDD enclosure and you can build your current HDD into that enclosure. You haven't lost any data this way, you have sped up your device considerably and you gained an external drive with storage capacity you were already used to. Hook that external drive up to one of your USB 3.0 ports and it will be faster than whatever SD card you wish to use and you won't be bothered by that 32 GB limit.

EDIT: Shades snuck in before me :P

Only because of my laziness to look up the SD standards conversion tables for names given out by standard committees and names given out by manufacturer's marketing departments.....  :D

The service manual from your device states the card reader is made by RealTek. The HP driver download website tells me that the most modern driver for your device is: Realtek Card Reader Driver 10.0.10125.21277 Rev.C.

You can download a slightly newer driver from the downloads section of the Realtek website, but there is likely not much to gain by that. Anyway, on the Realtek website I found that the chipset they use for card readers is labelled: RTS5169.

That is where you will find any and all formats the SD card reader supports. But from what I gather from the last link, there is no support for the fastest SD card models. TBH, I couldn't be arsed to look up what speeds it does support, as SD card manufacturers and their standards committee make almost as bad a mess as the fools that man the current USB standards committee. Both are equally deserving of being taken behind the shed in a similarly painful way, if you'd ask me. And I know you didn't.

Also, your device is sporting a 6th generation Intel CPU. And not the best ones available in that generation, so you shouldn't expect too much from your card reader regardless. As always, when you have high I/O demands, you better have a powerful CPU (with equally powerful supportive chipset) capable of processing that amount of I/O. Especially a problem in low- and mid-range budget laptops which by definition need to compromise on hardware, because of cooling limitations, battery-life and parts prices. On a side note: the current Intel CPU family is designated as 11th generation, so a 6th generation CPU is getting a bit long in the tooth in 'hardware years', which is quite similar to the concept of 'dog years'.

Depending on how airflow is managed inside your laptop, there might be heat-related issues popping up when you keep the lid closed.

Some designs use holes in the bottom plate of the keyboard as vents to let hot air out. And their efficiency will be drastically reduced when covered with a closed lid, which may generate heat of its own.

Some designs have openings in their chassis when the lid is open, but those same openings are covered when the lid is closed (as protection against picking up debris inside carrying cases).

If desk space is such an issue, you'd be better off trading one laptop in for an Intel NUC model. Those take up very little room and are designed to be (passively) cooled, while still packing a considerable "punch" given their size.

Because a KVM switch was not asked for. And no, no expensive routers or anything were required with the software solutions provided here.

Sometimes I wonder why I even bother.
