
Recent Posts

2126
Living Room / Re: 10th Anniversary - long time member check-in thread
« Last post by IainB on July 16, 2015, 10:53 AM »
...and we'll leave a seat at the table.
__________________

...or a channel open on that frequency...
2127
Living Room / Re: Interesting "stuff"
« Last post by IainB on July 16, 2015, 10:47 AM »
My daughter got back from school one day a short while back, and proceeded to make and play with some oobleck.
I was fascinated, never having heard of it before.
It vaguely reminded me of what was called "silly putty", but it wasn't the same thing by any means.
2128
...forgive my whining, but do I really have to read an entire book, before using this one? Does it work out of the box?  :tellme:
____________________________

That link goes to 404...
As to "Does it work out of the box?" - Why not install it and see?    :tellme:
2129
Details of some useful stuff for blocking/deleting garbage and cookies in Firefox:
  • BetterPrivacy: a useful extension that removes LSOs in Firefox, as described below:
    Remove or manage a new and uncommon kind of cookie, better known as LSOs. The BetterPrivacy safeguard offers various ways to handle Flash cookies set by Google, YouTube, Ebay and others...
    ________________________

  • Cookienator: this is not an add-on, but a program. I run it in Windows at start-up and occasionally throughout the day. It simultaneously removes cookies from all of the following:
    • IE
    • Firefox
    • Google Chrome
    • Safari
    • Flash local storage
    Note: You can manually add extra domains (e.g., facebook.com) to its built-in list of domains to block cookies from.

  • Blocked Add-Ons in Firefox: this is an informative list maintained by Mozilla, which Carol Haynes linked to in a post many pages earlier.

  • Privacy Badger: from eff.org; blocks spying ads and invisible trackers.

  • Self-Destructing Cookies: if you already had this Firefox add-on, you may find (as I did) that it had been mysteriously deleted without telling you, presumably as part of what seems to have been a "registration" scam that Firefox has apparently been quietly operating against selected add-ons recently. You can see why this particular add-on might be "unwanted" by Mozilla's sponsors: it is highly effective at automatically clearing cookies left on your hard drive after you leave the website that wrote them.    :o
    All the more reason for reinstalling that particular add-on and any others deleted by Mozilla in like fashion, and for looking for an alternative browser developer that doesn't play these sorts of games against its users. To protect yourself, keep a list of your add-ons, so you can refer to it periodically to see what has been removed. Extension List Dumper would give you such a list plus the file locations for the add-ons, so you could retrieve them from back-ups. If you use FEBE, CLEO or OPIE, then you can just restore the deleted add-ons right away.  :Thmbsup:
2130
Ah, many thanks @lanux128. That's very helpful of you.
I shall see if I can get the AHK approach and the control panel display to work nicely together, but from what you say you have already established that they probably can't. Bother.
Time to experiment, methinks.
_________________________________

Bother, it seems that you can't have your cake and eat it.

Though I haven't tested all these points (below) under all conditions, it seems that the Logitech software controls all the headset functions/buttons. Thus, if you disable (bypass) the G930 headset's Logitech driver and software, then:
  • (a) The little switch on the side of the headset that switches between stereo and Dolby surround sound is effectively disabled and you get stereo only, in either position.
  • (b) The G930 control panel software showing levels, equalizer, mic and avatar voice controls, headset battery status display, and surround sound status/test is not invoked (does not run).
  • (c) The programmable buttons G1, G2 and G3 are disabled (though in my experience these may not always work properly with Winamp anyway).
  • (d) The mic presumably works as an ordinary mic, but the headset's mic-mute button might not work, and the mic's background-noise sensitivity/filtering and the listen-to-mic feature may not work either.
  • (e) The headset's volume control wheel might not work.

Thus, with the software disabled, you would seem to lose most of the headset's extra functionality and the superb listening experience for which you paid $$$ - leaving, at best, a pass-through listening experience with unmodulated stereo output.

So, I wouldn't recommend buying the G930 headset if you do not intend to use the software/driver.
2131
Living Room / Re: TV shows thread
« Last post by IainB on July 10, 2015, 03:35 AM »
...But speaking of Prisoner, it seemed like every time I would make a move to ask the boss for a raise, that bouncing ball thing would smother my efforts.  ;)

Hahaha. Nicely put.
2132
I've not got anything Win10 yet, though I am awaiting its scheduled release later this month.
What you say sounds interesting. Anything that might improve the ergonomics or efficiency of the ON GUI is probably good news for users.
2133
General Software Discussion / Re: Looking for Software with this feature
« Last post by IainB on July 10, 2015, 03:08 AM »
Good luck with the wooden deck repairs. No fun. I prefer tanalised timber or concrete (low maintenance).
2134
General Software Discussion / Re: Firefox and Cyberfox release 39.0 stable
« Last post by IainB on July 10, 2015, 02:27 AM »
@TaoPhoenix: Yes, I reckon you put it very well, and this point in particular is familiar to me because it is tried-and-tested standard/"best" business practice, as taught in Business School 101 (Financial and Management Accounting):
...2. In between things being made directly illegal, is another layer of "corporation games". These include MS's sinking of Nokia via Stephen Elop, the ousting via outing at Firefox, buying up the niche players and then purposely shuttering them while swiping a useful piece of tech knowhow. ...

It is a pragmatic 4-step asset-stripping method:
  • 1. Acquisition: acquire/buy up (through hostile or friendly takeover) the desired assets and liabilities of the competitive product/producer/technology and run it as a discrete and separate function, for a while.
  • 2. Stripping: progressively strip any useful productive competitive assets and IP, absorbing/reorganising them into your main business.
  • 3. Preparation: shut down/mothball all residual/unwanted operations and resource costs that remain (including laying off now "non-essential" personnel), treating any losses or costs (e.g., discounted scrap sale, redundancy payouts) as "acquisition costs" or "business transformation costs" and as a charge/write-off against profits, for tax minimisation.
  • 4. Disposal: sell/dispose of the remaining assets and liabilities treating any losses as "acquisition costs" and as a charge/write-off against profits, for tax minimisation.

This is often termed just "business as usual" for the average profitable and cost-efficient corporate psychopath. Any non-profit-focussed distractions (e.g., image-building, "pet" projects of the CEO, or philanthropy) along the way are likely to be seriously unprofitable and, if taken to excess, may even eventually cause a business to collapse - a sobering example of the truth of this would be Control Data Corp.
All's fair in love and war...

2135
General Software Discussion / Re: Looking for Software with this feature
« Last post by IainB on July 09, 2015, 03:35 PM »
@ednja: Thank you for explaining why you consider file size to be an issue for you, and for describing the nature of your population of files (Guitar Pro tab files). I downloaded one to examine the contents, which look to be a mixture of ASCII and binary data.

From what you say, I think you may misunderstand why size is largely irrelevant and best not used for comparing files.
Take a hypothetical example, where you have 2 files composed of just ASCII text:
File #1 contains only the character string "ABCDE"
File #2 contains only the character string "VWXYZ"

Since the binary value that represents a single ASCII character is fixed at 8 bits (1 byte) in length, the total file size in each case would be the same - i.e., 40 bits, or 5 bytes.
So the file sizes will be the same, though the data contained is not the same.

However, a checksum (which is a hashing calculation) of the binary values of each file will be very different.
File size is simply a measure of how many bytes there are in a file and how much space the file occupies on disk.
Equal file sizes would be consistent with identical contents, but would not otherwise tell you anything useful about the actual contents of the files - e.g., whether there is a high probability that the contents are identical.

A file checksum is a number computed by hashing the contents of a file, and is very nearly unique to those contents. In xplorer² it is a good guide as to uniqueness, as it shows this numeric "summary" of a file's contents:
  • If the checksums of two files are different, then the files are definitely different, even though they may have the same file size.
  • However, if the checksums are equal, then this would imply a high probability that the files are identical (though this is not absolutely certain), regardless of the file sizes. If the file sizes were also equal, then this could possibly be used to augment the user's confidence level that the files were identical, but it would carry no statistical certainty that this was in fact the case.
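
By way of illustration of the two points above, here is a minimal Python sketch of the hypothetical ABCDE/VWXYZ example (the file names are made up; xplorer² or any hashing tool would do the same job):

import hashlib

def checksum(path):
    # Hash the raw bytes of the file; equal hashes imply identical contents
    # with near-certainty, regardless of what the file sizes say.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Two files of identical size (5 bytes each) but different contents.
with open("file1.txt", "wb") as f:
    f.write(b"ABCDE")
with open("file2.txt", "wb") as f:
    f.write(b"VWXYZ")

print(checksum("file1.txt") == checksum("file2.txt"))  # False: same size, different contents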

So, going back to my points from above:
...Sorry, but I think at this point I must be missing something as I do not understand:
  • (a) Why you need to "overwrite the existing file." with the file in Source in Filter #1. It is a superfluous/redundant step. You need only to leave the Source file as-is (or delete it if it is not wanted), and leave the target file untouched.
  • (b) Why you persist in including the use of file size as a basis of comparison at all, when it would seem to be irrelevant (QED). It confuses the issue unnecessarily.

- and, unless I am mistaken:
  • Point (a) remains valid.
  • Point (b) remains valid.

Thus, what you would seem to have is a relatively straightforward and conventional backup problem requiring no special software (as you seem to think it might need).

Suggestions:
  • I had been awaiting your explaining where I was "missing" something before suggesting that you could do worse than use the excellent FreeFileSync, but @MilesAhead and @tomos have since covered that in comments above.
  • Also, since your disk space requirements for this data are not all that great, and if backup space is not a problem, then I would suggest that you avoid hastily taking the potentially irretrievable step of automating the deletion of your duplicate files. Consider instead using the versioning feature of FreeFileSync, which would enable you to preserve all the data in an organised fashion using customised backup rules (as suggested by @MilesAhead); you could then take your time sifting and organising it (say) along the lines suggested by @TaoPhoenix, before deleting anything.
  • If/when you have backed everything up to backup hard drives or CDs, another really handy tool that might be of use to you could be VisualCD. You can download it here.
  • Since your most significant file metadata is apparently contained in the filename, VisualCD would take relatively little time to scan and index the backup drives. (Full metadata would take a lot longer.) I use it this way all the time, and keep the VisualCD indexes on my laptop, where they are very quickly searched; VisualCD will even open the actual file on the archive device/media, if connected.
  • If space were an issue, then you might like to consider compressing (e.g., .ZIP) large groups of the files to see whether significant space-savings were possible, or using native Windows disk compression in NTFS.
  • You could consider dumping all the files into a database using a simple tool - e.g., (maybe) Excel, Access or GS-Base. For the latter, refer to Database System GS-Base Is Easy as a Spreadsheet--And the Price is a Steal. That review is a bit old. I am playing with a newer version of GS-Base that I bought from BitsDuJour for USD9.95, with the idea of using it as a backup database for certain types of files. As a backup repository, all the files would be stored as objects in records. Having duplicate filenames in such a repository would not cause any problems.

Hope this helps or is of use, and always assuming that I have got things right.
2136
Fraud and cover-ups at the FDA?
http://www.slate.com...den_from.single.html
http://www.wired.com...ng-americans-health/
...
________________________________
I read something today that reminded me of what you posted there, so I went back and re-read your post and followed the links. Those points you referred to now look like particularly good points, in retrospect.
What I have realised is that it's not just the FDA, but also a system of other government-directed organisations that have been established as the authoritative sources of science on some aspect or other of controlling our lives - and this has implications that I probably had not fully appreciated before.
Putting those points in the context of this discussion thread, we have seen here several examples of "bad science" where research, the scientific method and the peer review process have in some way been abused/corrupted, apparently for financial gain (e.g., research funding, or business profit) or professional prestige, or both. There have also been alarming - but amusing - demonstrations of how easy it can be to get ludicrously bogus research (e.g., the chocolate diet weight-loss research) published with the stamp of authority in prestigious so-called "scientific" journals, whose editors seem bent on maximising readership (and revenue) rather than on establishing the veracity/validity and bona fides of the research itself.
These journals would seem to have a sort of "Never mind the thickness, feel the width." approach, where the line between fantastic, eye-grabbing journalism and bona fide research would seem to have been pretty fine at times, and repeatedly crossed (QED).

There has subsequently been what looks like a belated but relatively thorough housekeeping and weeding-out of the discovered bogus/suspect research, with it being retrospectively and publicly withdrawn from publication by the publishers involved - all of which is well and good. However, one hopes that the editorial staff of the publishers concerned acquire/regain a healthy skepticism and don't fall asleep at the helm again. Time will tell, though, and I for one am not going to hold my breath: experience indicates that unless a business process is radically changed and improved, the quality of its outputs will by definition continue to be more or less of the same standard as before (Deming et al.).
That is, being "vigilant" isn't going to cut it, as that is not a process step.

Now suppose that some areas of scientific research:
  • (i) were declared to be officially the bailiwick of specific, authorised organisations, and
  • (ii) research in those areas was conducted, peer reviewed, and published solely by/through/under the auspices of one and the same supposedly authoritative organisation, and
  • (iii) that that organisation was always a pseudo-government organisation, NALGO (Non-Aligned Government Organisation), QUANGO (Quasi-Autonomous Non-Governmental Organisation) or similar, having been set up as an organisation to which a government has devolved power at "arm's-length".

What sort of peer-reviewed research outputs could be expected to come from these pseudo-government organisations and what sort of outcomes could we expect from that research?

I'll attempt to answer that question, but first would suggest that we reflect on what US President Eisenhower said about government and science/research, and why:
"The prospect of domination of the nation's scholars by Federal employment, project allocations, and the power of money is ever present – and is gravely to be regarded.
Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite."

- from Eisenhower's Farewell Address to the Nation, January 17, 1961.
______________________________

So, let's look at some examples where it seems that these supposedly public-serving organisations have in fact been serving political and/or corporate objectives and to the detriment of their public service:


1. The FDAs (Food & Drug Administrations): (add into here @Renegade's points on the FDA, above)
  • The US FDA: has apparently recently come out and retracted its longstanding sanctioning of all those manufactured and hydrogenated vegetable oils/fats that were supposedly such healthy alternatives to the deadly animal fats and dripping, or something. Most of those manufactured oils were apparently not what does a body good, after all. Oh, and by the way, contrary to what we said, animal fats and dripping are also really good for you! What a surprise! (NOT).
    Outcome: A huge and unknowable loss in human health/life and medicare costs for the general public in many countries where the FDA advice to ingest the toxic fats rather than the safer and non-toxic fats was adhered to. A huge win for the corporates and medical insurers. The manufactured vegetable and other oils scam effectively created an enormous worldwide market that must have netted the manufacturers billions in profits over the years. Presumably a new scam/market will be required now.

  • The NZ FDA #1: From memory, so the details may be a bit sketchy: I recall reading some old copies of The Spectator from the '80s/'90s carrying investigative reporting that the NZ FDA had apparently succumbed to corporate pressure and allowed the "Just Juice" label to misleadingly market a heavily-sweetened juice as pure juice, or something. Though their juices contained no added sugar per se, the NZ manufacturer apparently had a surplus of apples and was able to manufacture sugar out of apples, producing an odourless, tasteless and colourless solution of high-concentration fructose (sugar), at less than or equivalent cost to buying raw sugar. They apparently wanted to mix this into their juices, label it as "with apple base", and imply or state that it was unsweetened or contained no added sugar (i.e., implying it was a "pure" juice). The product still seems to be available in supermarkets today, still with its misleading labelling re the undisclosed fructose additive. Just juice it ain't.
    Outcome: A financial win for the corporate, and a loss to the public. The FDA effectively sanctioned the hoodwinking of the consumers, denying them the right to have true and honest labelling, so they ended up buying the stuff thinking it to be a healthy and pure juice, not realising that it was unnaturally loaded with fructose.

  • The NZ FDA #2: Again, from memory, so the details may be a bit sketchy, I recall reading some old copies of The Spectator from the '80s/'90s that showed that the NZ FDA had apparently succumbed to corporate pressure and allowed Kelloggs to market their manufactured Cornflakes product as being "nutritious", though a nutritionist might be quick to point out that there is little or no nutritional value in cornflakes. They apparently claimed that by spraying-on riboflavin and iron additives, or something, the product had nutritional value. They apparently gave the NZ FDA an ultimatum - either we are allowed to label it as we want on product produced in NZ, or we take our cornflake production plant to Australia and import it from there, or something. Kelloggs Cornflakes are still sold in NZ, though I am unsure whether they are still manufactured in NZ.
    Outcome: A financial win for the corporate, and a loss to the public. The FDA effectively sanctioned the hoodwinking of the consumers, denying them the right to have true and honest labelling, so they ended up buying the cornflakes thinking it to be a naturally nutritive product. (I don't think this would pass muster in the EU though, so it may be a reflection of the immature or "Wild West" nature of consumer rights and protection standards/laws in NZ/Australasia.)

  • NZ FDA side note: Perhaps unsurprisingly, the then director of the NZ FDA, Dick Hubbard (who I think may have also been a food nutritionist), must have recognised that the NZ FDA was unable to operate independently, because he later resigned and in 1990 founded (together with his wife Diana) the company "Hubbards", with the objectives: 1. Make Good Food, and 2. Make A Difference. Hubbards' breakfast cereals and other cereal products are the yummiest I have ever tasted and all are nutritious, and they definitely raised the bar for other cereal manufacturers. He doesn't make any cornflakes. He's a millionaire now, and he continues to make a difference.
    Outcome: A big win all round, and some superb and nourishing cereal foods for the consumer, as well as some major benefits for the many disadvantaged people Hubbard targeted for employment in his food-making factories.
    ______________________________

2. "Climate" bodies - IPCC (UN), EPA (US), DECC (UK):
The analysis of leaked emails and documents from Climategate (File: FOIA2009.zip) was important for several reasons, including:
  • (a) It was a timely wake-up call, indicating to the sleeping masses (myself included) that there was something decidedly rotten in the state of Denmark.
  • (b) It enabled anyone who wished to analyse and seek the truth and "Find out for yourself" ("Nullius in verba" per the founding motto of the now apparently somewhat discredited Royal Society).
  • (c) It enabled the analyst to discover - warts an' all - emails illustrating the extent to which scientivists were apparently engaged in deliberate stochastic lying and obfuscation in the scientific research and peer review process on the subject of MMGW (Man-Made Global Warming, now re-badged as "Climate Change™").

It subsequently became apparent that the motivation for this seeming perversion of science and statistics could lie in the desire to push potentially world-changing religio-political ideologies, and possibly also in the desire to secure further abundant scientific research funding and the huge profits from the government-subsidised sale of environmentally destructive, capital- and land-intensive wind farm and solar cell engineering projects. All of this was driven mostly by the founding Charter of the IPCC, which was based in large part on the de facto assumption of the thesis of MMGW.
For example, the subject was referred to as "The Cause" in correspondence between scientivists at the IPCC, Penn StateU and the UEA CRU.
As if to prove the point, I today came across what would seem to be proof of more of the same, apparently from a Swiss government minister and intended to dupe the Swiss voters: Former Swiss Minister: Okay To Lie About Climate “If It Is For The Good”… | NoTricksZone
Outcome: A political push based on an apparently so far unsubstantiated need (CO2 reduction and MMGW), towards a new global taxation and transfer-pricing regime (carbon credits), with sovereignty and economic and political power being transferred to a global unelected government and monstrous bureaucracy. Chalk up a big financial win for corporations profiting from selling the new "sustainable" energy-generation systems, and for corporate sponges, unelected representatives, bureaucratic process and hordes of overpaid "charity" and policy wonks, and a huge financial loss for the taxpaying public who have funded this. A huge, elaborate and expensive charade (for the taxpayer) spanning years, with no real foreseeable benefit so far, and with electricity being produced by so-called "sustainable" means - e.g., wind farms and solar cell collector systems - at a cost which is apparently "Astronomical" (per Bill Gates recently), and which could never meet existing electricity/energy demand projections, nor feasibly or cost-effectively replace the cheaper and/or longer-lived fossil-fuelled power sources, nor the hydro-electric or nuclear power sources.


3. The CDCs (Centers for Disease Control):
Now this is the thing that got me to thinking today about the abuse of science and peer review in public services for ulterior motives: Quietly, Congress extends a ban on CDC research on gun violence.
Reading the background, the proposal to do this research would seem to have been politically motivated - both when it was first proposed and now - and the decisions to shut it down, then and now, would also seem to have been politically motivated.
Outcome: A great big red flashing warning light.
I found the whole thing laughable, but at the same time I thought it rather frightening - the evidence of a political fight over a seemingly remorseless desire for political control, being exercised through the mechanism of the US CDC, a pseudo-government agency providing a valuable public service. This giant nation seems to be at war against itself and its Constitution, and almost everybody else, and on so many fronts.

The US CDC was presumably set up with the same objectives as CDCs in other countries - that is, to conduct studies of disease/sickness and to conduct epidemiological studies, in order to better protect public health. Thus, by no stretch of the imagination could a gun fit the definition of a disease/sickness. If guns were a suitable subject for study by the CDC, then before long there could be a study of automobiles (to reduce death from automobile accidents or to stop them being used in the act of committing crimes), and then there'd be a study of knives, and then psychological studies of (say) religious fanaticism (to reduce death by murderous religious fanatics), and pretty soon it could be thought crimes and the objective would be to stop people from thinking thoughts that were deemed politically incorrect or put them in gulags, etc. We wouldn't even need to think of a plan for this, as we could take the lessons straight from the communist manifesto. A slippery slope indeed.
______________________________

So the National Geographic is arguably spot-on where it has that moronic propaganda on the front page about "The War Against Science", except that it's the government agencies that could seem to be engaged in a war against the citizens, where Science is the main weapon, and the scientific method and the peer review process are merely tools used in achieving the objective of beating everyone into submission to the conventional elite/political wisdom. This is what Eisenhower was warning about in his prescient farewell speech. It is hostile political force, and some people (not me, you understand) might say that the US would seem to be a prime offender at it, actively pushing some other countries to go in the same direction as well, through the UN, World Bank, IMF, WTO and other international agencies and trade agreements, but I couldn't possibly comment.

Those same people might go on to say that, looking at all this from a global perspective, it doesn't seem to make too much difference whether one submits to (say) the domination of the hegemonic religio-political ideology of Islam and joins ISIS, or submits to the secular US hegemonic religio-political ideology of political and economic domination, because the choice is arguably pretty much the same - submit or die, and either way you absolutely lose your freedom - but again, I couldn't possibly comment.

Mind you, as my adoptive brother Khaled (a Muslim) would point out, if one had to make the choice between those two systems of religio-political ideology, then one could perhaps be forgiven for being inclined to choose Islam, because Islam categorically has Allah on its side and at least Islam fiercely protects its own and does not subject them to slavery.
2137
General Software Discussion / Re: Looking for Software with this feature
« Last post by IainB on July 08, 2015, 10:22 AM »
@ednja: This is your description of your revised filters, with the Opening Post filters inserted below each in the quotes, for comparison. I have highlighted the difference in the newer Filter description:
... I would modify the filters as follows:
Filter #1:  If the file being moved has the same name, extension, size and content as a file already existing in the target folder, then overwrite the existing file.
Whereas the Opening Post says:
Filter #1:  If the file being moved has the same name, extension and size as a file already existing in the target folder, then overwrite the existing file.
______________________________

Filter #2:  If the file being moved has the same name and extension as a file already existing in the target folder, but the two files are different in size or have different content or both, then move the file, but keep both files.
Whereas the Opening Post says:
Filter #2:  If the file being moved has the same name and extension as a file already existing in the target folder, but the two files are different in size, then move the file, but keep both files.
______________________________

Sorry, but I think at this point I must be missing something as I do not understand:
(a) Why you need to "overwrite the existing file." with the file in Source in Filter #1. It is a superfluous/redundant step. You need only to leave the Source file as-is (or delete it if it is not wanted), and leave the target file untouched.
(b) Why you persist in including the use of file size as a basis of comparison at all, when it would seem to be irrelevant (QED). It confuses the issue unnecessarily.
2138
Find And Run Robot / Re: FARR slow searching/pauses - tips
« Last post by IainB on July 06, 2015, 08:39 AM »
@mouser: I have just installed FARR v2.226.01 Beta.
Do you recall that I reported some time back that FARR had a relatively frequent and annoying habit of not properly coming to the foreground when invoked? This kept recurring.

Well, it might be a coincidence, but this latest version seems to come very promptly to the foreground when invoked. Touch wood, maybe you have inadvertently fixed that problem. I'll let you know if/when the problem ever returns.
2139
General Software Discussion / Re: Firefox and Cyberfox release 39.0 stable
« Last post by IainB on July 06, 2015, 03:13 AM »
It almost seems like that and extension signing are designed to break extensions.  I was going to ask who will use Firefox when the extension writing community is destroyed.  But maybe they only want the ingenue neophyte blissfully unaware of being caged.  IOW people who only know internet through phones and tablets.
They cannot possibly not know the effect.  I was toying with the idea of trying my hand at writing an extension until I started reading the thread on the Firefox extension forum about signing.  The handwriting is on the wall.
We need someone to create AnarchistFox.  The browser that lets you do whatever the hell you want whenever you want to do it.  :)
_______________________

Your comments here would seem to be ironic, in retrospect, given that Mozilla Firefox rather started out with a vision representing a white knight championing the freeing of the Internet experience from the death-like grip of the corporate ad-merchants' and MS Windows/IE fascists' control, or something, and incorporated deliberate design principles so that anyone could make FF "their own" with whatever extensions/add-ons/scripts they wanted. The many Internet serfs amongst us who strongly identified with and shared such a vision could stride into the future with heads held high, their fists raised in firm defiance, secure in the knowledge that, in FF, they had a White Knight - a champion and an ally. This champion offered a means and an opportunity for the serfs to secure and control their unique personal Internet experience - their Internet independence and freedom - and even to help them "stick it to the Man" - if they so wished (and some no doubt fervently did and may still do to this day).

/Rant ON.
____________________________________
The FF browser was a mascot, a logo, and - for some - an idol. It was A Just Cause in a Golden Age for Internet User Anarchy. So it became a sort of idolatrous Religion to fight the Crusade of Internet Anarchy, and it was going to make the world "a better place" - that is, if we all conformed and became FF users just like everybody else.

So, there were FF logos of all sorts, and FF banners for websites proudly proclaiming the FF "brand" and our lurve for FF, and there were FF parties, love-fests and conferences, and FF love-ins, wife-swapping parties and orgies, and FF discussion forums, and FF Doctorates became popular in universities, and people made FF cakes. Across the world, boy and sometimes girl children conceived by FF developers and fans during those heady times would often be christened with FF-redolent Christian names, including (for example):
  • "Firefox", "Fox" or "Foxy" (in English-speaking countries),
  • "Renard" (in French-speaking countries),
  • or the more masculine "Wulf" (as a matter of preference in Teutonic areas, or out of a misunderstanding in places where they either didn't have foxes or didn't know what foxes were),
  • or "Vulpo" (for native speakers of Esperanto).

Whatever happened to that vision? Hmm, let's see:
  • 1. Gone are the encouraging but childishly amusing marketing logos of mighty Mozilla Firefox robots with rocket-drives built into their feet, or something, conquering the Internet. Mozilla has now "come of age", is toeing the party line, and is clearly not being allowed to continue its anarchistic and disruptive technology development.

  • 2. Gone is the flow of encouragement/motivation directed at the Faithful - the FF supporter community - exhorting them to greater efforts to proselytise and to "spread the Firefox word", or to put FF banners on their websites, or whatever. That would all be pointless, idealistic nonsense now.

  • 3. Gone is the independent, anarchic uniqueness of FF, which seems to have become a semi-ubiquitous and increasingly iron-fisted vanilla product in an apparently pseudo-competitive market where all the other browsers have the same iron fist and the same vanilla taste, including IE, Chromium, Google Chrome, and FF's several forks. Yet MS is apparently declaring that it will be pulling IE out of the browser market? Yeah, right.

  • 4. Gone is the sense of direction from Mozilla about "how many millions of downloads of Firefox" there have been, or what the browser market share looks like. So who at Mozilla cares who is using FF and why they are using it? The answer may well be that no-one is interested since it is irrelevant. The so-called "browser wars" (possibly a feel-good mythical invention?) have now ostensibly ended, with browser development apparently being controlled in the background via cartels - manipulative groups of corporate/commercial and political and spying interests.
    The Old Media of the newspapers and TV news channels (i.e., MSM - MainStream Media) were/are similarly controlled by these cartels, and obligingly regurgitate the same indoctrination - often word-for-word. Nowhere else does this seem to be more apparent than in the USA. Old Media are now being forced to transform into the New Media of the Internet, and there has been an ongoing struggle by the old cartel(s) to establish a pre-eminent position of control over the New Media and the technologies enabling the Internet.
    The prevailing/pre-eminent cartel(s) in each area of territorial sovereignty will slowly tighten the noose around users' necks, so that users will be forced to have ONLY the collective experience, utility and financial intermediation of the Internet that the cartels choose to allow and enable. Anything else will be defined as "illegal". This is apparently being, and is likely to continue to be, governed mostly through State intervention/decree and regulatory bodies appointed within areas of territorial sovereignty. This is already becoming a fait accompli in many instances, having resulted from covert and overt action and collaboration between States - e.g., including Pan-European, Pan-American, Pan-Australasian and Pan-Asian efforts on SOPA, TPP, etc.

  • 5. Gone is the freedom to make and choose extensions/scripts.
    • We saw userscripts.org mysteriously taken off the air and, as if to make certain, at about the same time Greasemonkey was forcibly updated to a version that was apparently not backwards compatible with "older" userscripts.

    • Recently, Read It Later (aka "Pocket") was silently made a mandatory component in FF. What the heck is that about, if not that prior FF policy and standards excluding such actions have been turned on their heads due to corruption for financial gain? I used to have the RIL extension anyway, but now that it has been made mandatory I am deleting the thing altogether (the extension and within about:config).

    • Recently, compulsory "Registration" was introduced unilaterally by Mozilla, which prohibited add-ons/extensions that were not "Registered" by Mozilla - read "Licenced" - so all of our extensions slowly disappeared, to be belatedly brought back with "Registered" labels. Said labels are now being deliberately removed, so that we won't be able to tell which are which, and that will confuse us so that we won't be able to figure out which extensions have never been allowed back. This market manipulation is redolent of the Greasemonkey script Facebook Unfriendfinder, which provided information about who had unfriended and re-friended your account; Facebook lawyers apparently leaned heavily on the author and he withdrew it. It still works, with a bit of tweaking to work around subsequent and ongoing Facebook changes.

  • 6. Gone is Mozilla's apparent political independence, with Mozilla apparently succumbing to fascism. The man who was a primary founder and architect of the Mozilla vision was mercilessly hounded out of his newly-Board-assigned position as Mozilla's CEO within days of taking up the position, because a minority group apparently did not want him there and so picked on his personal and long-known views against homosexual "marriage", or something, as making him out to be unfit to be in what had become a politically correct organisation - one where even certain thoughts or opinions differing from the official collective view were apparently no longer to be allowed, and indeed were a crime to be punished by being put in the stocks for public humiliation as an example of what was to be done to heretics, followed by excommunication or professional lynching, or both. Shades of The Royal Society and Prof. Eric Laithwaite.
    ____________________________________

/Rant OFF.
2140
General Software Discussion / Re: Looking for Software with this feature
« Last post by IainB on July 03, 2015, 09:55 PM »
Dopus also does a reasonably advanced sync - and could be used as IainB suggests with xplorer2
but I don't know if sync or backup is what is required, @ednja?
_________________________

First off, I have suggested that the redundancy in Filter #1 be addressed/eliminated.
Second, whether what is required is "sync" or "backup" would probably depend on the definition of those terms.
In any event, I was not advocating either "sync" or "backup" per se; it was merely that the image I posted showed xplorer²'s two-pane Sync Wizard's options (filter) being used (together with other settable filters, it's a very powerful tool for comparing/amalgamating files in separate directories/media).
2141
General Software Discussion / Re: Looking for Software with this feature
« Last post by IainB on July 03, 2015, 09:21 PM »
I'm wondering why file size matters, because wouldn't the date be off by even a few seconds if it's two different copies of a file? Even in a high speed automated "log.txt" or something updated and then aggressively backed up, do any of the options above change context if it doesn't need to know the file size (or maybe checksum, because for ex someone opens a text file and then say MS Word adds a line break it's now different.)
_______________________

The OP refers to file "name, extension and size", but file size is generally an unreliable/imprecise basis for file comparison, whereas content (checksum) is pretty definitive as a data analysis tool.
You seem to have conflated "time" with "size", and yes, "time" is also an imprecise basis for file comparison - mainly because of the different and inconsistent time-stamping standards applied by different file operations and file management tools.
Where you say "...do any of the options above change context if it doesn't need to know the file size (or maybe checksum, because for ex someone opens a text file and then say MS Word adds a line break it's now different.)", my response would be that the OP apparently has a redundant requirement in Filter #1 (QED), that my comments would seem to indicate that "size" is irrelevant (QED), and that a "checksum" is an imperative for file validation when housekeeping in cases such as this. It's not my opinion; it's just Computer Operations Housekeeping Best Practice 101, and is typically the sort of thing that would be drilled into you in programmer training if you worked for a computer company. It is also strongly justified as a forensic and prudent measure in its own right.
And yes, the checksum comparison would show a difference between file A and file B where file B was the result of (say) "...someone opens a text file and then say MS Word adds a line break it's now different.", but that's not the case here. In this case, A and B are apparently identical in "name, extension and size" and presumed identical; but we know that size is an unreliable indicator, so we need to verify whether they are identical in fact, and the only valid test we have is whether they are the same in content (checksum). Sure, a difference could be attributable to (say):
  • (a) file corruption on write, or after having been written (e.g., if there had been a disk surface or other media degradation),
  • (b) a virus infection update,
  • (c) an MS Word update of the type you describe.
- but in this case I think there is an implicit assumption that the possibilities for (b) and/or (c) would have been eliminated before this backup amalgamation/rationalisation stage. However, if that is an invalid assumption, then the problem expands to one of entire database verification and validation, prior to going further with backup amalgamation/rationalisation. You have to start with clean source data, otherwise you can forget about backups, as they become largely irrelevant.
My understanding from the OP is that the files in Source are effectively taken as being the clean master version of the files in Filter #1, and any duplicate files in Target are some kind of copy (e.g., backup copy) of the master files.
2142
General Software Discussion / Re: Looking for Software with this feature
« Last post by IainB on July 03, 2015, 07:58 PM »
"Size" could be a potentially unreliable comparison, so I would recommend using "Content" instead.
What could be useful, therefore, would be to verify whether files in the source directory with identical name/extension to files in the target directory were actually identical in content (~size), and only overwrite/copy (one way or the other) if the content were different and perhaps depending on the date. I would use a file checksum comparison between the two.
__________________________
I think there are likely duplicate file finders that would get rid of the unwanted sources by doing a hash if the sizes matched.  I just don't know the names of any.  The "keep both" case is kind of a pain.  I haven't looked at dupe file utilities to see exactly what features are available.
__________________________

Yes, but I was not advocating a "keep both" policy. I can see what you mean above, but my point was purely about verification of data BEFORE the irretrievable operation of the duplicate "master" in Source being written over the same file in Target and then deleted from Source. This would be regardless of what happened to the Source master file later - e.g., if the files had identical content, then the duplicate in Target would remain untouched and the master in Source could remain untouched or simply be deleted from Source if housekeeping no longer required it to remain there.

Verification is essential: if the files had been identified as "identical files" in terms of filename/date/size, they still might not be identical in fact, and a content (checksum) comparison would verify that one way or the other.
For example, a corrupted file would give a different checksum, and if you got a different checksum in one file, then you would need to inspect both files to establish which was the uncorrupted one, and then use that as the "master".
If the presumed "identical" Source and Target files had identical content, then "moving" the Source file to the Target (per Filter #1 in the OP) would be a redundant (unnecessary) and ill-advised action, for two reasons:
  • (a) efficiency, resource utilisation and workload: it would unnecessarily use computing resources (read-write) and add time to the operation for apparently no good reason whatsoever.
  • (b) risk and data validation workload: if the two files have been established as identical in content (checksum), and one is then overwritten by the other, it introduces the potential risk of a "bad" or corrupted write over an uncorrupted file (why would you do that?); avoiding that would necessitate an unnecessary/inefficient (QED) robust write ("robust" meaning "read after write"), thus using more computer resources and adding time to the process.
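
To make that concrete, here is a minimal Python sketch of the verify-before-housekeeping step (the file paths are hypothetical, and the deletion is deliberately left commented out):

import hashlib, os

def checksum(path):
    # Hash the file in blocks, so large files don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

source = r"C:\Source\song.gp5"  # hypothetical paths
target = r"C:\Target\song.gp5"

if checksum(source) == checksum(target):
    # Verified identical in content: the "move and overwrite" step is redundant.
    # Leave the target untouched; optionally delete the source copy:
    # os.remove(source)
    print("Identical content - no overwrite needed.")
else:
    # Same name but different content: keep both and inspect before deleting either.
    print("Contents differ - keep both and investigate.")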
2143
General Software Discussion / Re: Looking for Software with this feature
« Last post by IainB on July 03, 2015, 04:33 AM »
I was following up on @MilesAhead's comment which had caught my interest as I had not previously tried out RichCopy:
I would take a look at RichCopy
It was written by a Microsoft Employee so it likely has rules or filters xcopy and robocopy do not.
______________________________

RichCopy looked to have some uniquely useful functionality, so I have downloaded and installed it - thanks for the tip!    :Thmbsup:

Now I might be missing something here, so I apologise in advance if I have it wrong, but since I had only skimmed over the OP and had not actually read it fully, I thought I should at least try and understand it, so I read about the requirement for the two Filters:

...
  • Filter #1:  If the file being moved has the same name, extension and size as a file already existing in the target folder, then overwrite the existing file.

  • Filter #2:  If the file being moved has the same name and extension as a file already existing in the target folder, but the two files are different in size, then move the file, but keep both files.

______________________________

On reading the OP, it occurred to me that the requirement in Filter #1 would seem to be redundant, since, if a file about to be moved had the same name, extension and size as a file already existing in the target folder, then there would be no need to move it and overwrite the existing file in the target, and thus you would leave it as-is.

"Size" could be a potentially unreliable comparison, so I would recommend using "Content" instead.
What could be useful, therefore, would be to verify whether files in the source directory with identical name/extension to files in the target directory were actually identical in content (~size), and only overwrite/copy (one way or the other) if the content were different and perhaps depending on the date. I would use a file checksum comparison between the two.
Normally this would seem to be a kinda paranoid check, but I actually do it when verifying my archived (backup) files, and it's a piece of cake to do it using xplorer². In the example below, I've just used two panes showing views of two folders - Source and Target - but xplorer² could enable the user to run this verification whilst syncing nested Source directories (plural) and the corresponding nested Target directories, in the LHS and RHS panes, respectively. This would only apply where the directory trees were identical. You could also use flat files to get a view of the scale of the overall problem.

xplorer² - unique file differences comparison.png


The above also gives you the files for Filter #2, so that you can then copy/move all those (already auto-selected) in the Source folder into the Target folder, keeping the names in the Target, but automatically incrementing by +1 for the newly-copied/moved files, so nothing in the Target folder will have been overwritten/destroyed. However, it would probably be preferable to use a backup tool (e.g. something with versioning, like FreeFileSync) for this, so that the Target file name remained unchanged, and the older version was moved to a version folder.
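
For anyone without xplorer², here is a rough Python sketch of the same Source/Target verification pass over a single pair of folders (the folder paths are hypothetical; it only reports, and deliberately changes nothing):

import hashlib, os

def checksum(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

source_dir = r"C:\Source"  # hypothetical paths
target_dir = r"C:\Target"

for name in os.listdir(source_dir):
    src = os.path.join(source_dir, name)
    tgt = os.path.join(target_dir, name)
    if not os.path.isfile(src):
        continue
    if not os.path.exists(tgt):
        print("Only in Source:", name)                # candidate for a plain copy
    elif checksum(src) == checksum(tgt):
        print("Identical content:", name)             # Filter #1 case: overwrite is redundant
    else:
        print("Same name, different content:", name)  # Filter #2 case: keep both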
2144
General Software Discussion / Re: how to take a very long screenshot
« Last post by IainB on July 02, 2015, 04:08 PM »
The system Tree command might be useful:

Tree - listing in Power Shell.png
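
For example (assuming a Windows command prompt; the folder path is made up), the whole directory tree can be dumped to a text file rather than stitching screenshots together:

tree "C:\Some Folder" /F /A > tree.txt

The /F switch lists the files in each folder, and /A draws the tree with plain ASCII characters, which paste cleanly into a forum post.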
2145
Living Room / Re: Reader's Corner - The amazing Internet Archive
« Last post by IainB on June 28, 2015, 11:07 PM »
The Archive.org website has been transformed/overhauled.
Now there are even more good reasons for being a member/user of the Internet Archive - but be aware of the Internet Archive's Terms of Use, Privacy Policy, and Copyright Policy.

(These links have been added to/inserted in the opening post.)
2146
Living Room / Re: Be prepared against ransomware viruses..
« Last post by IainB on June 27, 2015, 06:59 PM »
In the DOS OS, I recall using an excellent file manager/explorer called Lotus Magellan. From memory, one of the functions it had which I tried out but rarely used was to calculate and record the CRC (Cyclic Redundancy Check) value for important files that you wanted to preserve. You could then periodically run a check to see whether the CRC value had changed (i.e., if the file contents had been changed).

In a modern OS, in the case of a virus that encrypts a file but leaves the file name/extension unchanged, you could have a report that tells you when the CRC (or other checksum) of specified data filenames/types has changed (i.e., when the file contents have been changed).
In the case of a virus that encrypts a file and changes the filename/extension, you could have a report that tells you when the old file name/extension is changed or if it "disappears" (i.e., is renamed in some way or deleted).

Some kind of monitor/logging/warning like that seems like it might be useful for data file security. I don't know whether that is a common practice though. For example, the OS can object strongly if specific system file types are touched in any way, so it might be happening at a system-file level.
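
As a rough sketch of what such a monitor might look like (my own illustration, not Magellan's; the paths and baseline filename are made up, and SHA-256 stands in for the CRC):

import hashlib, json, os

BASELINE = "checksums.json"  # hypothetical baseline store

def checksum(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def record(folder):
    # Record a checksum for every file you want to preserve.
    sums = {}
    for root, _, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            sums[path] = checksum(path)
    with open(BASELINE, "w") as f:
        json.dump(sums, f, indent=2)

def verify():
    # Report files whose contents changed, or which have disappeared/been renamed.
    with open(BASELINE) as f:
        sums = json.load(f)
    for path, old in sums.items():
        if not os.path.exists(path):
            print("MISSING:", path)
        elif checksum(path) != old:
            print("CHANGED:", path)

You would run record() once over the folders you care about, then run verify() periodically and investigate anything reported.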
2147
Living Room / Re: Be prepared against ransomware viruses..
« Last post by IainB on June 27, 2015, 06:39 PM »
@mouser: Are you able to answer this? It would be interesting to know what defences the virus had got through.
@mouser: What virus and/or malware protection did your relative have on his/her PC?
2148
UPDATE: 2015-06-28 0008hrs: IsoBuster v3.6 released (2015-06-19).
_________________________________

Release notes: (Copied below sans embedded hyperlinks/images.)
_________________________________
IsoBuster 3.6 Released!       June 19, 2015

IsoBuster 3.6 was released today and it features a ton of new functionality, improvements and has some important bug fixes as well. IsoBuster 3.6 is a great update again and if you're running a 3.x version you should definitely (and freely) upgrade to this new version! If you have a 1.x or 2.x license you can still upgrade with a nice discount!

Here's a list of all the goodies:

Changes / New:
    Support for the Linux EXT file system
    Support for Rimage mastered CD/DVD discs with manifest file
    Support for Nintendo GameCube file system
    Support for GEMDOS / Atari - ST FAT12-16 variant
    Support for High Sierra on CD-ROM (the predecessor to ISO9660)
    Implemented internal device caching, especially used during File System recognition. Many file systems start from similar addresses; caching avoids having to re-read blocks for every file system that is checked
    Detect if the Nintendo Wii file system is present and show an icon for it (*)
    Detect if the Linux RomFS file system is present and show an icon for it (*)
    Detect if the Unix/Linux JFS file system is present and show an icon for it (*)
    Detect if the Unix/Linux ZFS file system is present and show an icon for it (*)
    Detect if the Unix/Linux Minix file system is present and show an icon for it (*)
    Detect if the Linux BtrFS file system is present and show an icon for it (*)
    Detect if the Linux SquashFS file system is present and show an icon for it (*)
    Detect if the Linux CramFS file system is present and show an icon for it (*)
    Detect if the Linux BeFS (BFS) file system is present and show an icon for it (*)
    Detect if the Microsoft ReFS file system is present and show an icon for it (*)
    New and much faster way to find deleted files and folders in an NTFS file system

(*) Full exploration of this file system is not implemented but now an investigator can see if it is present

Improvements:
    Extra tests to make sure a child folder doesn't have subfolders that are a parent folder, creating circular links, in buggy or recovered file systems
    Added a 'Paranoid mode' when creating/managing IBP/IBQ image files, to make sure all data is flushed to the destination without system caching, and structures are updated regularly
    Allow to complete a managed image file from media of which the layout doesn't fully match, but the risk is manageable [Professional]
    Allow to complete a managed image file from another non-managed image file [Professional]
    Improved speed when updating the IBP managed image file
    Added various extraction type switches via the command line: /ET: A, IBP, WAV, RAW, R2U, RUN, DLL
    Added new file system switches via the command line: /EF: see the various newly implemented file systems
    Added Pinnacle Studio mastered DVDs to the IFO/VOB recognition sequence ("PCLE UDFLib")
    Added ability (via right mouse click) to see the properties-window-text as text in a memo field (for easy copy and paste)
    Support for underscores in function names in the libewf.dll, so that Borland bcc32 built dlls can be used as well
    Always display FileName:StreamName, with or without [ADS] appended, for NTFS Alternate Data Streams
    Removed registration dialog nag when doing a surface scan on BD media
    Improvements in the Extract From-To functionality, dialog and warnings
    Make sure testing for encrypted partitions only happens once, not every time the visual node is created (e.g. when switching devices in GUI)
    Do not read extra blocks to test for partition encryption if there are enough cached blocks
    Updated the 'Agent' string when doing an online query, to check for a new version, to be more compatible with modern servers and systems
    Reverse the order of AVDP parsing, LBA 512 first rather than 256 first, in case of a CeQuadrat made UDF disc, to deal with CeQuadrat UDF bugs
    Get the proper volume name for CeQuadrat made UDF CD-R discs
    After an image file has been made, save its filename to the recently opened image files, so that it can be opened immediately from the recent image files' list
    Improved .GI image file interpretation, specifically improvements in finding the header size
    Show files and folders with the System property in another color
    Show special files, file entries that are not used in the classical way by Unix/Linux file systems, in another color
    Show Windows overlay icons and add the shortcut overlay icon to EXT symbolic link files
    Added checkbox to options to uncheck using CD-Text in filenames, when audio tracks are extracted
    The file-exists dialog now also allows to auto-rename a file, instead of over-writing or ignoring the file (non-Windows file systems allow duplicate but caps-sensitive names in the same folder)
    New dialog to auto-rename filenames that are illegal in Windows but OK in other non-Windows file systems
    Auto-rename folders, during extraction, when they contain illegal Windows-filename characters or a Windows-reserved file/folder name.
    Show Endianess in UFS, XFS, ISO and SquashFS and other File System properties
    Improvements in drag&drop functionality and the use of the temporary folder. Better clean-up afterwards
    Possibility to always get the RETRY SELECT ABORT error dialog (instead of RETRY OMIT ABORT) on file extraction
    Various improvements, changes and re-writes in the core code / engine, as this is a living project and to deal with the ever-growing new functionality
    Various GUI improvements

Fixes:
    Fixed GUI issue that caused incorrect values to be displayed in certain error messages
    Fixed possible hang when EWF image files are loaded
    Fixed that sometimes the sanitizing part after finding missing files and folders on a FAT volume could take 'forever', due to bad FAT records
    Fixed data-corruption issue, introduced in 3.5, while extracting files with more than 10 extents (fragments would be extracted in the wrong order)
    Fixed it so that when an IBP/IBQ is made twice in a row from the same media (without a refresh) the bitmap is still fully written out to the IBP
    Avoid exception error on bad IBP without (enough) bitmap data (rare test case)
    Fixed extraction of named streams of an NTFS folder (not file)


Download IsoBuster 3.6 here.

Please tell people about it: like it on Facebook, share it via Facebook or Twitter, post it on forums, etc. Stuff like that is really appreciated. Start by clicking the "Recommend" button below if you're on Facebook.

Peter Van Hove,
Founder and CEO
_________________________________
2149
Living Room / Re: Be prepared against ransomware viruses..
« Last post by IainB on June 27, 2015, 07:05 AM »
@mouser: What virus and/or malware protection did your relative have on his/her PC?
2150
DC Gamer Club / Re: Microsoft Tinker robot puzzle game - download for free
« Last post by IainB on June 24, 2015, 12:59 PM »
The game is really good on Vista Ultimate!

On Win 8.1-64 PRO, however...

Tinker robot puzzle game - 01 problems.png


Bother.