
Recent Posts

4501
Living Room / Re: General av and anti-malware discussion
« Last post by IainB on November 05, 2012, 10:29 PM »
I used to use ERUNT on XP - but according to the FAQs page, it will only work if UAC is off...
I didn't know that. I have switched UAC off anyway.

On the subject of malware:
If you are interested in how hijack trojans and botnets can be built, there's a really interesting blog post at the Malwarebytes blog: Citadel: a cyber-criminal’s ultimate weapon?
It describes how to set up and operate CITADEL - a "crimekit" (a tool to develop and implement a cybercriminal botnet) - to do things such as infecting other PCs to gather data or launching hijack trojans.
It then covers how MBAM blocks a lot of these nasties, but makes the point that user caution is still advisable, as the technology is becoming increasingly sophisticated. Apparently things like Webinject phishing popups cannot always be detected/blocked, though I think your browser might be able to do something to block spurious third party popups.
4502
Mini-Reviews by Members / Re: Malwarebytes FREE and PRO - Mini-Review.
« Last post by IainB on November 05, 2012, 10:23 PM »
If you are interested in how hijack trojans and botnets can be built, there's a really interesting blog post at the Malwarebytes blog: Citadel: a cyber-criminal’s ultimate weapon?
It describes how to set up and operate CITADEL - a "crimekit" (a tool to develop and implement a cybercriminal botnet) - to do things such as infecting other PCs to gather data or launching hijack trojans.
It then covers how MBAM blocks a lot of these nasties, but makes the point that user caution is still advisable, as the technology is becoming increasingly sophisticated. Apparently things like Webinject phishing popups cannot always be detected/blocked, though I think your browser might be able to do something to block spurious third party popups.
4503
Living Room / Re: Hurricane Sandy Discussion Thread
« Last post by IainB on November 05, 2012, 07:23 PM »
...Meanwhile, goodwill from regular people is rejected at nursing homes...
Yes, but that's not the point. The point seems to be that the State (in the shape of FEMA) will provide disaster relief processes, and the relief itself, or whatever is required in such circumstances.
Goodwill - e.g., gifts of tea, biscuits and sympathy - would thus be unnecessary and is to be discouraged. It is not a part of the process and will only confuse the heck out of an otherwise orderly management of things in the midst of chaos.
If the disaster relief - e.g., food - arrived too late such that some people starve to death or become ill before it arrives, then that would be unfortunate.
EDIT 2012-11-07: [/sarc]
4504
Living Room / Re: Files aren’t property, says US government
« Last post by IainB on November 05, 2012, 07:09 PM »
@40hz: Yes of course, but you see I was not excluding the terminology in the opening post.
If you start off with fuzzy terms, you generally end up with fuzzy thinking.

Separately, this seemed relevant: http://imgs.xkcd.com.../infrastructures.png
4505
Living Room / Re: Files aren’t property, says US government
« Last post by IainB on November 05, 2012, 05:26 PM »
This seems rather confuzzling to me. It might even all be leading to a good thing, depending on how you looked at it or defined things.
I think a careful definition of all terms used might be in order. Things could look a little clearer then.
4506
Living Room / Re: Reader's Corner - The Library of Utopia
« Last post by IainB on November 05, 2012, 02:21 PM »
I am a keen and long-standing user of English dictionaries. My preferred dictionary is the Oxford series, and they have been my preferred digital dictionaries too. For years, I used to rely on a DOS-based version of the Concise Oxford Dictionary, which also worked under Windows. Then I was given an upgrade to what was called "The New Shorter Oxford Dictionary", but the version I have doesn't work under Win7-64, so I don't have a client-based dictionary at present. I miss that, and find online dictionaries inconvenient to use, kludgy, slow and seemingly deliberately constrained.
However, the advent of digital dictionaries and other reference works has opened up a brighter potential future for them and us - a potential that did not really exist whilst they had to rest within the constraints and limitations of just being paper-based. Because of this, there is a natural technology-enabled trend to digitise reference works.
Thus I was not too surprised to read in goodEreader that: Macmillan Dictionary To Be Strictly Digital
(Copied below sans embedded hyperlinks.)
Macmillan Dictionary To Be Strictly Digital
By Mercy Pilkington - 2012-11-05

Macmillan announced today that it will be ending the print edition of the popular dictionary and opting for an online-only format, beginning next year. Macmillan Dictionary Online will be the new home of future updates to the dictionary.

Editor-in-chief Michael Rundell told Charlotte Williams of The Bookseller, “The traditional book format is very limiting for any kind of reference work. Books are out of date as soon as they’re printed, and the space constraints they impose often compromise our goals of clarity and completeness. There is so much more we can do for our users in digital media.”

Rundell's basis for the announcement was that digital publishing lends itself better to the kind of updating that dictionaries require. Rather than waiting several years for sales of the outdated dictionary to taper off enough to justify the expense of publishing another edition, the publisher can add new words to the online dictionary as they become recognized.

This news followed the announcement from HarperCollins earlier this year that it will be releasing its iconic Collins World Atlas as a digital edition for tablets, which allows the publisher to take advantage of the interactive benefits that tablets allow. The online destination for the Macmillan dictionary will continue to be found at MacmillanDictionary.com, along with other features that the publisher hosts at the site.
4507
Yes, I thought that indicated a lesson for us all - if we needed it.
4508
Bulgarian banks show us how Internet censorship should work: Bulgarian Banks Try To Silence Web Site That Called Them 'Bad Apples'
4509
Looks pretty confusing in the UK too: Any Hint Of Evidence Based Copyright In The UK Seen As Nefarious Plot By Parliamentary Copyright Maximalists

No matter. I wouldn't be surprised if Internet freedoms all went down the plughole, same as in the US, only quicker.
4510
General Software Discussion / Re: organize data for research
« Last post by IainB on November 05, 2012, 01:11 AM »
I see a problem with the amount of data they can handle
that's why a sql-featured software looks more sensible
Very droll.
4513
Another 'roach: The Internet Radio Fairness Act: Revamping the Online Radio Marketplace
I hadn't known this was in the works.
Royalties for online radio and other digital music services are a prominent topic for today’s recorded music industry, and the discussion has only grown with the recent introduction of the Internet Radio Fairness Act in the House and Senate. IRFA aims to revamp the parts of the Copyright Act that create licenses for online radio services to pay for transmitting sound recordings to their users. More specifically, IRFA would change the standard by which online radio royalty rates are set, alter the qualifications and appointment procedures for the Copyright Royalty Judges, and make several more changes to the process of setting online radio royalties.

What could possibly go wrong?
4514
General Software Discussion / Re: organize data for research
« Last post by IainB on November 04, 2012, 03:49 PM »
I'd echo @40hz, @Shades and @barney:
+ 1 for MS Office and in particular OneNote.

Suggestions:
4515
Another "I thought this was a joke until I read it" moment:
Anti-Piracy Group Threatens To Sue ISPs Over TV Show ‘Piracy’
enigmax, November 2, 2012

CIAPC, the anti-piracy group that has successfully forced ISPs in Finland to block The Pirate Bay, has threatened to sue the ISPs themselves over alleged TV show piracy. Local ISPs such as Elisa and TeliaSonera offer cloud services where their customers can store TV shows for later viewing over the Internet. CIAPC says the services fall outside the scope of private copying “fair use” and therefore require a license to operate legally. The ISPs are ignoring demands to shut down the services and now face legal action.

For decades TV companies lived in the moment, transmitting TV shows at a certain time and date and expecting their customers to adapt to their predetermined schedules. Be around when the show airs, be around for the repeat, or miss it forever, the business model used to dictate.

Technologies such as VHS and more recently home hard disc recorders went some way to bridging the accessibility gap but these days customers increasingly want everything “on demand”, at a time and place of their choosing, not one dictated by a TV company.

To fill this gap in the market, some ISPs such as Elisa and TeliaSonera in Finland are offering their subscribers personal cloud storage. As a TV show is aired it is recorded to the customer’s cloud account, ready to be watched over the Internet at a more convenient time.

The ISPs and their subscribers appear to be happy with the convenience of the services but perhaps unsurprisingly they are now coming under attack from rightsholders.

CIAPC, the anti-piracy group that successfully forced ISPs such as Elisa, TeliaSonera and DNA (around 80% of the Finnish Internet market) to block The Pirate Bay, insists these services are illegal and should be shut down.

“Storage services for TV shows are currently offered by around twenty companies, including major Internet service providers such as Elisa and TeliaSonera,” CIAPC explain. “None of the companies have licenses for the services. This is significant, because the issue concerns around 100 million euros worth of commercial services.”

CIAPC say they wrote to the companies advising them that their services breach copyright law and ordering them to be shut down, but thus far the warnings have gone unheeded. So this week CIAPC reiterated their threats that if the services remain operational, legal action will follow.

“None of the service providers has complied with the requirement of the ban. It appears that a legal solution needs to be considered,” says CIAPC managing director Antti Kotilainen.

The timing of the threats appears to be linked to an announcement last week that the operators of TVkaista, a company offering similar services, had been charged for illegally offering the content of several TV companies without permission.

TVkaista’s CEO and technical director are accused of copyright and intellectual property offenses plus aggravated fraud. The company’s legal adviser is charged with criminal copyright offenses and copyright fraud.

The accused all protest their innocence. They insist that their service is legal under current law which grants their customers a fair use exception for private copying of TV shows for personal use.

The service offered by TVkaista is, however, slightly different to that being offered by Elisa and TeliaSonera. TVkaista records all programs and stores them for a few weeks whether customers ask for them or not. The other services only record TV shows on demand.

CIAPC say that the Copyright Act only permits users to save content such as TV shows, movies and music locally within the home, and these cloud services don’t fit that description.

None of the ISPs are expected to give in without a fight.

And @Renegade, yes, I do see that a lot of what David Icke says makes sense. This "ISP Piracy" nonsense is an example of what he was on about - Fascism. The rightsholders seem to want to have us all in a straitjacket where we get force-fed exactly what they want to feed us, when they want to, and we will have to pay them for the privilege every time.
4516
It sometimes seems as though quite a lot of the issues affecting Internet freedoms tend to be largely ignored or "under-reported" by the MSM (MainStream Media), so that we interested parties (Internet users) actually only find out about things at the last minute.
Now why might that be?
...
I didn't try to answer my own question, but I have just read where Rick Falkvinge answers it for me - and very cogently, it seems: Why SOPA-supporting news networks don't mention SOPA.
He starts with an arguably valid assumption, which he then substantiates: that there are relatively high levels of functional illiteracy in some populations. He gives the estimate as typically 50% in the industrialized parts of the world, with Italy higher - about 57% (i.e., only 43% functionally literate). He then goes on to suggest that this functional illiteracy was taken advantage of.

He writes of the Berlusconi fiasco: (my emphasis)
On these six [Italian] television networks, the referendum was simply not mentioned. Not once. Deemed not newsworthy.
At the end of the day, this enraged the Italian people enough to bump voter turnout over 50% anyway, and the referendum passed. Very shortly thereafter, having had his immunity revoked, Berlusconi stepped down.
Are we starting to see parallels to the SOPA blackout yet?

Conclusion
If you control what other people know, if you control the village newswell, then you control the entire village. The Catholic Church was in this privileged position before the printing press (which is also why they demanded harsher and harsher penalties — up to and including the death penalty — for unauthorized copying of knowledge in their time).
The one thing that can threaten TV news networks is the Internet and the ability for people to communicate directly, bypassing the judgment of the now-famous 1% to determine what knowledge befits the masses. We learn from history that all such power is always used to maintain and strengthen itself first. So, SOPA basically kills that ability of the everyday person to bypass the 1%.

Therefore, it is in the economic and political interest of today’s newswells to kill a strategic threat to their privileged position, and to act just like Berlusconi did in Italy: to actively not bring the topic up onto people’s radar.

In other words, Corporate United States is just as corrupt as Berlusconi’s Italy was, and is acting just like the Catholic Church did when they tried to kill the printing press.

Worth reading the whole post.
4517
Living Room / Re: History of CP/M
« Last post by IainB on November 03, 2012, 01:23 AM »
Interesting.
Very similar to, and mostly but not always identical to, this: http://bulote.iteye.com/blog/1387331

It's essentially the same text, but with some different bits added in:
Here is the relevant bit (cleaned up using cleaner.exe):
Spoiler
The history of CP/M  (version 2)
--------------------
Captured 2012-11-04 0716hrs from Firefox [firefox.exe]:
http://bulote.iteye.com/blog/1387331
--------------------

Chapter 9 The Development of Computer Operating System
642
9.1 The Operating System
An operating system is a program that runs on a computer to simplify the use of the computer for the user.  The operating system manages the use of peripheral devices such as printers, monitors and keyboards.  In addition the operating system will run other programs and display the results.  In order to carry out these functions the operating system has to require a systematic structure for the inputs and outputs; there is a definite structure to files and there is a systematic way in which the files are stored on the data storage devices.  Without an operating system a computer is largely an unresponsive hunk of metal and wires.

643
Although now the concept of an operating system appears to be a natural and obvious one, operating systems evolved over a considerable period of time.
The first electronic computers were "hardwired" to carry out systematic computations.  Initially the computations were for ballistics tables.  The user would wire direct connections between the various components of the computer through a plug board.
When the computations were finished the next user would have to pull out the wires and rewire for the next set of computations.  This was monumentally cumbersome by today's standards but a marvelous advance in speed and accuracy over hand computations with pencil and paper.

644
In the late 1960's M.I.T. had a time sharing operating system called MULTICS, the name indicating it was a multiple-user system.  Ken Thompson was working at Bell Labs in New Jersey and was given the use of a PDP-7 minicomputer.  He decided to create an operating system for the minicomputer for the convenience it provided even though there would be only one user.
Initially he called this operating system UNICS in analogy with MULTICS but later changed the spelling to UNIX.  At the same time Dennis Ritchie was involved in the creation of the programming language "C," so named because it was modeled on the programming language developed in Britain called "B." The collaboration between Ken Thompson and Dennis Ritchie has been quite fruitful over the years.  UNIX and C have also been closely linked.

645
9.2 Multics
In 1964, following implementation of the Compatible Time-Sharing System (CTSS), serious planning began on the development of a new computer system specifically organized as a prototype of a computer utility.  The plans and aspirations for this system, called Multics (for Multiplexed Information and Computing Service), were described in a set of six papers presented at the 1965 Fall Joint Computer Conference.  The development of the system was undertaken as a cooperative effort involving the Bell Telephone Laboratories (from 1965 to 1969), the computer department of the General Electric Company, and Project MAC of M.I.T.  Implicit in the 1965 papers was the expectation that there should be a later examination of the development effort.

646
From the present vantage point, however, it is clear that a definitive examination cannot be presented in a single paper.  As a result, the present paper discusses only some of the many possible topics.  First we review the goals, history and current status of the Multics project.  This review is followed by a brief description of the appearance of the Multics system to its various classes of users.  Finally, several topics are given which represent some of the research insights which have come out of the development activities.  This organization has been chosen in order to emphasize those aspects of software systems having the goals of a computer utility which we feel to be of special interest.

647
9.3 UNIX
* UNIX was an important innovation in computing.  It was awkward, but computer professionals were perfectly willing to tolerate its difficulties in order to get the power it gave them access to.  UNIX's shortcomings were not considered notable at the time.
The concept of user-friendly software came a decade later.  UNIX users were more concerned that something could be achieved at all than whether it required the use of non-mnemonic commands.
* The use of UNIX spread around the country and initially Bell Labs gave it away free.  Later Bell Labs realized that UNIX had commercial potential and arranged for the marketing of it.

648
Background
* The name "Unix" was intended as a pun on Multics (and was written "Unics" at first, for UNiplexed Information and Computing System).
* For the first 10 years, Unix development was essentially confined to Bell Labs and most scripting-related work was also done in NJ.  The initial versions of Unix were labeled "Version n" or "Nth Edition" (of the manuals), and some milestones in shell history are directly related to particular Unix releases.  The major early Unix implementations were for DEC's PDP-11 (16 bits), which was tiny by today's hardware standards (typical configurations were limited to 128K memory, 2.4M disc, and a 64K per-process limit including the kernel); similar configurations can be found today only in palmtop computers and top-end electronic watches.  The fact that they managed to create pretty powerful shells for such a computer is simply amazing and attests to the tremendous ingenuity of the early creators of Unix.

649
For computer science at Bell Laboratories, the period 1968-1969 was somewhat unsettled.  The main reason for this was the slow, though clearly inevitable, withdrawal of the Labs from the Multics project.  To the Labs computing community as a whole, the problem was the increasing obviousness of the failure of Multics to deliver promptly any sort of usable system, let alone the panacea envisioned earlier.  For much of this time, the Murray Hill Computer Center was also running a costly GE 645 machine that inadequately simulated the GE 635.
Another shake-up that occurred during this period was the organizational separation of computing services and computing research.

650
* Thompson is really the guy who is primarily attributed with developing UNIX.  He's an employee of AT&T-Bell Labs at the time-and still is.  Dennis Ritchie was the co-developer.  It was really those two guys working together who developed UNIX.
* Bourne wrote the Bourne shell (sh); Korn wrote the Korn shell (ksh).  Steve Johnson was very involved in writing a lot of the early utilities associated with UNIX.  Kernighan was involved in various utilities, but was primarily involved in the C language with Ritchie - as was Plauger, also a C language guy.
Plauger wrote the first commercial C compiler.
Interestingly enough, all these guys are still out there doing related things in the UNIX business.

651
In the case of UNIX, the stage was set by events going back at least as far as 1945.  There were four or five things that happened over a period of years that made it possible for the whole UNIX thing to happen by the grass roots method that it did.

652
In 1945, AT&T was involved in an antitrust case with the federal government.  The federal government felt that AT&T was monopolistic, so in 1956 it pushed AT&T into a consent decree in which AT&T agreed to make all its patented technology licensable to the public.  AT&T was also restricted to the communications business.  As a result, it couldn't be in the computer business.  This big event is the reason AT&T never commercialized UNIX along the way - it wasn't allowed to.

653
* One of the other significant events that was taking place at this same time was a project going on at MIT called Project MAC (Multiple Access Computers).  They were doing research on time-sharing systems, trying to allow multiple users to interactively use a computer system.  So time-sharing and multi-programming--all that multi-user stuff--is really evolving and taking place around 1963 and in the early '60's.
* So AT&T, GE, and IBM formed a partnership with MIT and Project MAC to try to develop a time-sharing operating system called MULTICS.  MULTICS was meant to be a large multi-user system, but it turned out that this large multi-user system could initially only support a total of one or two users interactively, which isn't exactly what IBM and AT&T had in mind when they started out on the project.  (GE ultimately was able to turn MULTICS into a viable commercial product.)

654
* Not too long after that, in 1969, AT&T Bell Labs said enough is enough, we're pulling out of this Project MAC sinkhole.  Around that same time, these two guys, Thompson and Ritchie, were finishing up some education requirements.  Here's another Berkeley connection: Ken Thompson got a Masters degree in Electrical Engineering from nowhere else but Berkeley, so that's the link back to why the software ended up back at Berkeley.
* Dennis Ritchie was working on and finishing up a degree in Mathematics at Harvard, but he ended up without his Ph.D. - he decided to bag it and just go to work.  So Thompson and Ritchie went to work for Bell Labs thinking they were going to work on this fantastic new operating system called MULTICS.

655
* They're all fired up and excited and do in fact spend a little time on it but within months of when they get there, the plug gets pulled and they don't have a project.  Well, no problem, because Thompson figures he could have done it better himself anyway, which is in fact the case.  So he says, I'm going to write a multiuser system.  One of the main motivations for Thompson embarking on this project was that he needed a decent operating system to run his game called Space Travel on, so instead of optimizing Space Travel he decided to write a new operating system.  He jumped right on it.

656
* There was a one-month period when, with all the extra time, he could really focus on this thing, and he basically wrote UNIX in one month.  He wrote a kernel, a shell, a file handling system, and one of the other utility sets, and had a working operating system.  He did this on a 4K machine with 18-bit words, and that's what UNIX ran on originally.  You hear the folklore of Bill Gates writing his 4K BASIC compiler; well, Thompson wrote a 4K operating system - a 4K multi-user operating system - which is pretty impressive.

657 Value
* UNIX is the most innovative, influential operating system in the history of computing.  And it really is.  If you look at all the other operating systems, there are many ideas that are derived from UNIX.  Look at the DOS commands.  DOS took baby elements out of UNIX - it's just ideas completely extracted from UNIX.
* The original version of UNIX was written in PDP assembly language.  In 1972 it was rewritten in a language called C, which was another fundamental breakthrough in the whole process - they developed this new programming language just so they could write the operating system in it.  So UNIX was one of the first operating systems written in a high-level language.

658
9.4 CP/M
* With the concept of an operating system widely popularized, it was standard practice to develop an operating system for each new line of computers.  About this time the personal computer was developed.
* Gary Kildall of the Naval Postgraduate School in Monterey, California acquired one of the early personal computers and he immediately proceeded to develop an operating system for it.
He called the operating system CP/M, for Control Program/Monitor.  It was the first operating system for a personal computer.

Background
* In the beginning, there was CP/M.  As the first easily configurable standard operating system for micros based on Intel's then-flagship 8080, this small but effective system became the MS-DOS of its day.  With its logical, simple, and open architecture it captured the hearts of legions of amateur systems hackers the world over; so much so that even in the 1990's some diehards have refused to surrender entirely to the overwhelming dominance of DOS/Windows.  It also powered thousands of microcomputer-based business systems, often to the frustration of users who didn't care about its internals but hated dealing with its arcane command line syntax.
* CP/M was developed on Intel's 8080 emulator under DEC's TOPS-10 operating system, so naturally many parts of CP/M were inspired by it, including the eight-character filenames with a three-character extension that every MS-DOS/Windows 3.X user still lives with today.
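
The "eight characters plus three-character extension" rule described above can be sketched as a small validator. This is a hypothetical helper for illustration only - real CP/M also restricted which characters were legal in a name, which this sketch ignores:

```python
import re

# 8.3 pattern: up to 8 non-dot characters, then an optional dot
# followed by up to 3 non-dot characters for the extension.
EIGHT_THREE = re.compile(r"^[^.]{1,8}(\.[^.]{0,3})?$")

def is_8_3(filename: str) -> bool:
    """Return True if `filename` fits the 8.3 name.ext shape."""
    return bool(EIGHT_THREE.match(filename))

print(is_8_3("WORDSTAR.COM"))      # 8-char name, 3-char ext -> True
print(is_8_3("LONGFILENAME.TXT"))  # 12-char name -> False
```

The same length limits survived through MS-DOS and into Windows 3.x, which is why the sketch applies to those filenames too.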

660
* "Necessity is the mother of invention," the old saying goes.  And it's true; but as we all know it takes two to make a baby, and in the case of CP/M the father was a man named Gary Kildall, who in 1975 was working as a consultant to Intel.

661
* Kildall's task at Intel that year was to design and develop a language called PL/M for the 8080 chip, to be used as a systems development language.  At the time, the chips themselves barely existed and Intel was just then starting to design a computer system that used the 8080.  The plan was for Gary to use the 8080 emulator Intel had running on their big PDP-10 minicomputer, but he preferred to work directly on the 8080 itself, in part because by working on his own machine at home he could avoid the 50-mile drive to Intel every day.
The only 8080-based computer Intel had available was called the "Intellec-8", but it didn't have any software or disk storage attached to it.  So Kildall obtained a used test floppy drive free from Shugart Associates, attached it to the Intellec-8 with a controller designed by his friend John Torode, and wrote a primitive operating system for it which he called CP/M.

662 Development
The company's seminal product was CP/M 2.0, which fully separated the three components of the operating system into logical pieces: the CCP (console command processor); the BDOS (Basic Disk Operating System); and the BIOS.  Only the BIOS need be provided to get CP/M running on a new machine; the CCP and BDOS would be unchanged.  CP/M 2.0 was quite buggy, and was quickly followed by 2.1 as a fix-up release.  However, 2.1 was limited in its internal capacity to small floppy drives, and by 1977, hard drives were coming on the scene.  CP/M version 2.2 added expanded disk formatting tables which allowed access to up to 8 (eight) megabytes per drive in up to 8 (eight) total drives.  It was version 2.2 that became the megahit that dominated microcomputing almost from its outset.
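
The CCP/BDOS/BIOS split described above can be sketched as a layering, where only the bottom layer is machine-specific. This is a minimal illustrative sketch, not real CP/M code - all class and method names here are hypothetical:

```python
class BIOS:
    """Machine-specific layer: each hardware vendor supplies one."""
    def read_sector(self, track: int, sector: int) -> bytes:
        # A real BIOS would drive the disk controller; this stub
        # just returns an empty 128-byte sector.
        return b"\x00" * 128

class BDOS:
    """Machine-independent disk/file layer, built only on BIOS calls."""
    def __init__(self, bios: BIOS):
        self.bios = bios
    def read_record(self, track: int, sector: int) -> bytes:
        return self.bios.read_sector(track, sector)

class CCP:
    """Console command processor: parses commands, calls the BDOS."""
    def __init__(self, bdos: BDOS):
        self.bdos = bdos
    def run(self, command: str) -> str:
        if command.upper() == "DIR":
            self.bdos.read_record(0, 1)  # would scan the directory track
            return "A: WORDSTAR COM"     # stubbed directory listing
        return f"{command}?"

# Porting to a new machine means replacing only the BIOS layer;
# the BDOS and CCP above it are reused unchanged.
ccp = CCP(BDOS(BIOS()))
print(ccp.run("DIR"))
```

The design point is the same one the text makes: because the CCP and BDOS depend only on the BIOS interface, a vendor could bring CP/M up on new hardware by writing one small machine-specific layer.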

663
It was CP/M's adaptability that gave it appeal and launched it on the road to success.  It packed a surprising amount of power in a tiny package, and did so in a simple, clean logical way.  Many of its critics bemoaned its sometimes cryptic commands (rightly) and also its lack of powerful features.  But it must be remembered that CP/M was designed in an age when it was a rare, high-end computer owner that could afford the thousands of dollars it took to fill up the whole 64K of the 8080's address space.  The entire operating system took only 8K of the computer's memory, and would run in a mere 16K of total memory with room left over for any of its system development utilities to run.  More features would have swelled the system to the point where decently featured applications would have had no room to execute.

664
And it was the applications that moved this operating system out of the realm of the computer enthusiasts and into the hands of "real users" (people who don't care if their computers are powered by hamsters, so long as they run their necessary applications reliably).
The first real "killer app" for CP/M was probably WordStar, a word processing program that became very widely used.  Also famous was the first microcomputer database application, dBASE II.  These and many, many other applications and utilities eventually made CP/M a useful tool for a wide range of ordinary people.

665
* By 1981, a new generation of Intel microprocessors was on the horizon -- the 8086 and 8088 16-bit chips, which could address an incredible 1 megabyte of memory.  This seemed at the time more than anyone could ever figure out a use for, so Digital Research focused much of their attention on producing CP/M 3.0 for the dominant 8080/Z80 platform.  There were plans of course to port CP/M to the new 16-bit chips with a version called CP/M-86, but it was not a priority at the time.
* While DR did finally announce CP/M 3.0, a more full-featured successor to the successful 2.2, the upgrade was only for 8080/Z80-based systems, which were no longer seen as the coming thing by the public.  And CP/M-86 was ported to the IBM PC, but by that time IBM was practically giving away the new PC-DOS operating system.  Except for a diehard core of those that loved it for what it was, CP/M began rapidly to vanish from the land of living operating systems.

9.5 Microsoft Joined
Microsoft rose to fame and power on the basis of its Disk Operating System, one of the most dramatic business coups of the twentieth century.  But while DOS was great, it lacked the ease of use of the Apple system, so Microsoft launched a project to create an operating system that matched the ease of use of Apple's.  The result was Windows.  The first versions were not spectacularly successful, either technically or commercially, but Microsoft continued to develop Windows until it became virtually the universal operating system for personal computers.  This was in part due to the technical capabilities and ease of use of Windows, but it was also due to Microsoft's marketing practices, which resulted in every personal computer coming with Windows, so that acquiring any other operating system would be superfluous and costly.

9.5.1 DOS
* Microsoft initially kept the IBM deal a secret from Seattle Computer Products.  And in what was to become another extremely fortuitous move, Bill Gates, the not uncontroversial founder of Microsoft, persuaded IBM to let his company retain marketing rights for the operating system separately from the IBM PC project.
Microsoft renamed it PC-DOS (the IBM version) and MS-DOS (the Microsoft version).  The two versions were initially nearly identical, but they eventually diverged.
* MS-DOS soared in popularity with the surge in the PC market.  Revenue from its sales fuelled Microsoft's phenomenal growth, and MS-DOS was the key to the company's rapid emergence as the dominant firm in the software industry.  This product continued to be the largest single contributor to Microsoft's income well after the company had become more famous for Windows.

The final major version was 7.0, released in 1995 as part of Microsoft Windows 95.  It featured close integration with that operating system, including support for long filenames, and it dropped numerous utilities, some of which were instead supplied on the Windows 95 CD-ROM.  It was revised in 1997 as version 7.1, which added support for the FAT32 filesystem on hard disks.

Although many of its features were copied from UNIX, MS-DOS never came anywhere close to UNIX in terms of performance or features.  For example, MS-DOS never became a serious multi-user or multitasking operating system (both of which were core features of UNIX right from the start), in spite of attempts to retrofit these capabilities.  Multitasking is the ability of a computer to run two or more programs simultaneously.
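As an illustration of the concept (not of anything MS-DOS itself could do), here is a minimal Python sketch of multitasking, with two tasks making progress concurrently rather than one running to completion before the other starts:

```python
import threading
import time

results = []

def task(name, count):
    # Each task appends its name repeatedly; with two threads running
    # at once, the scheduler interleaves their progress.
    for _ in range(count):
        results.append(name)
        time.sleep(0.001)  # yield so the other thread can run

a = threading.Thread(target=task, args=("A", 5))
b = threading.Thread(target=task, args=("B", 5))
a.start()
b.start()
a.join()
b.join()

print(len(results))  # prints 10: five entries from each task
```

A single-tasking system like MS-DOS would instead run task A to completion before task B could begin.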


4518
Living Room / Re: General av and anti-malware discussion
« Last post by IainB on November 02, 2012, 07:21 PM »
...I use the .reg backup, ERUNT and make sure I have a recent restore point before cleaning out the crap...
Yes, I run ERUNT (with the option ticked to save all subhives) typically once a day, and make a daily restore point. I also try to ensure a restore point after any major update or program install.
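FWIW, that daily routine lends itself to scripting. Here's a minimal Python sketch of the idea - the ERUNT install path and backup folder are placeholders for my own setup, and the command-line switches (destination folder, sysreg, curuser, otherusers, /noconfirmdelete) are the ones documented in ERUNT's read-me, so check them against your version before relying on this:

```python
import datetime
import shutil
import subprocess
from pathlib import Path

BACKUP_ROOT = Path(r"C:\RegBackups")               # placeholder location
ERUNT_EXE = r"C:\Program Files\ERUNT\ERUNT.EXE"    # placeholder install path
KEEP_DAYS = 30

def dated_folder(root, today):
    # One folder per day, e.g. C:\RegBackups\2012-11-05
    return root / today.strftime("%Y-%m-%d")

def prune_old(root, today, keep_days=KEEP_DAYS):
    # Delete dated backup folders older than keep_days; ignore anything
    # that isn't named like YYYY-MM-DD.
    removed = []
    for child in root.iterdir():
        try:
            stamp = datetime.date.fromisoformat(child.name)
        except ValueError:
            continue  # not one of our dated folders
        if (today - stamp).days > keep_days:
            shutil.rmtree(child)
            removed.append(child.name)
    return removed

def run_backup():
    today = datetime.date.today()
    dest = dated_folder(BACKUP_ROOT, today)
    # sysreg/curuser/otherusers = back up the system and all user hives;
    # /noconfirmdelete suppresses the overwrite prompt (per ERUNT's docs).
    subprocess.run([ERUNT_EXE, str(dest),
                    "sysreg", "curuser", "otherusers",
                    "/noconfirmdelete"], check=True)
    prune_old(BACKUP_ROOT, today)
```

The pruning function only recognises folders named like 2012-11-05, so anything else sitting in the backup root is left alone.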

I only became more rigorous in the use of ERUNT after the experience I described above.    :-[

These sorts of precautions could be very useful in recovering from some kind of "corrupted" file/registry entry, or malware infestation - so you could (say) blindly do a restore, and forget about doing a root cause analysis.
If the changes to the laptop software/system occurred in a process that was in statistical control, then such an approach might be valid, but the process is not in statistical control and therefore it is no more than just a pragmatic and expedient shortcut to take such an approach. We remain ignorant as to root cause, afterwards.
4519
It sometimes seems as though quite a lot of the issues affecting Internet freedoms tend to be largely ignored or "under-reported" by the MSM (MainStream Media), so that we interested parties (Internet users) actually only find out about things at the last minute.
Now why might that be?
[/sarc]

Thank goodness for a relatively independent and diverse blogosphere to keep us informed (and I do not include MSM reports hidden behind paywalls in that) when the MSM either cannot or will not do so.

Relevant post from Falkvinge on Infopolicy:
Berlusconi Convicted: What We Learn From Political Media Contamination, though I think the word "Contamination" might really have been intended as "Collusion" or "Corruption" (as per the URL).
(Just the text copied below, sans embedded links/images.)
by Rick Falkvinge

Diversity: Against all odds, former Italian prime minister Berlusconi was recently sentenced to one year in prison. This followed a long process where an Italian referendum had to be held to revoke his legal immunity, in order to indict him in the first place. We can learn a lot about the dangers of politically controlled media from how Berlusconi tried to defeat this referendum.

Silvio Berlusconi was sentenced to one year in prison for “notable tax evasion”, and prohibited from holding public office for five years. Experts say that this means that the 76-year-old’s political career is effectively over.

However, this case has been dragging on for a long time, and started out as a seemingly hopeless case since Berlusconi enjoyed legal immunity for acts committed during his prime ministry. To repeal this immunity, a referendum was required in Italy – a population-wide referendum just to allow the former prime minister – one man – to even stand trial.

To understand the complexity of this situation, three pieces of data are vital:
First, the functional illiteracy in Italy is 43% (yes, forty-three per cent). This means that almost half of the population can’t read an average-complexity newspaper article, or this blog post, and understand its content and use its information in their daily lives.

Second, referendums in Italy need two things to pass and take judicial effect. Out of the voting people, over 50% of the valid votes cast must be “yes” votes – simple enough; there must be a majority in favor of the referendum. But the second thing required is that the voter turnout must also be over 50%. This means that there are two mutually exclusive strategies for defeating an Italian referendum – either campaign heavily for a “no” vote, or not campaign at all in shooting for a voter turnout less than the required half.
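Those two conditions are easy to conflate, so here is a toy Python sketch of the rule (the function name is mine, and blank/invalid ballots are ignored for simplicity):

```python
def referendum_passes(yes_votes, no_votes, eligible_voters):
    """Italian-style referendum: needs BOTH a majority of the votes
    cast AND a turnout above 50% of eligible voters."""
    turnout = (yes_votes + no_votes) / eligible_voters
    majority = yes_votes > no_votes
    return majority and turnout > 0.5

# An overwhelming "yes" majority still fails if too many people stay home:
print(referendum_passes(6_000, 5_000, 25_000))  # only 44% turnout -> False
print(referendum_passes(6_000, 5_000, 20_000))  # 55% turnout -> True
```

This is why not campaigning at all can be the stronger strategy for the "no" side: a heavy "no" campaign drives turnout over the quorum, while silence starves the referendum of the turnout it needs.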

Third, Silvio Berlusconi owns a controlling interest in six of the seven television networks in Italy.

Now, putting these three facts together, we observe that people in Italy do not get their daily news from newspapers (in fact, almost half of them are unable to do so), but from television and friends. We also observe that the programming on television is practically completely controlled by the single man in Italy who has anything to lose from the referendum passing.

So what happened?

The referendum wasn’t mentioned once on the televised news in six out of seven television networks. It was dismissed as “not newsworthy”, in all simplicity.

But the referendum passed anyway, reaching the required 50-percent voter turnout by and large thanks to the alternate newsflows of the net, which Silvio Berlusconi didn’t grow up with and which he hasn’t cared to understand. And so, Berlusconi was indicted. He stepped down from the prime ministry on practically the same day, and was sentenced to jail a few days ago, on October 26.

This story illustrates in a very straightforward way how media ownership, and the interest of the media owners, influences news valuation.

To put it bluntly, there is simply no such thing as “independent media” or “neutral media”. Ownership interests prevail; blood is thicker than water. Even public service channels choose to report from values in the middle lane. That’s not neutral; that’s in the middle lane.

Therefore, a plurality in reporting remains paramount – in order to hold elected leaders accountable, we need many reporters with many competing interests.
4520
...Maybe I'm just tired?...
Never mind. Just sit down, put your headphones on, and listen to this with your eyes closed. You'll feel more like your old self in a matter of minutes.



Not all people are on the same wavelength.    ;)
4521
Living Room / Re: General av and anti-malware discussion
« Last post by IainB on November 02, 2012, 04:33 PM »
Oh, I see. I wondered why you wrote what you did.
Yes, this malware thread did seem to have petered out - that's why I made the comment, just to help things along a bit.
The subject is probably not all that interesting to most people. The time when people are most likely to get really interested in malware discussions is when they actually have an active case to be concerned with, on their own or someone else's PC.
For example, I recently had a major problem with one of my laptops (the one my daughter uses), and it seemed like it might have been a trojan/hijack or something, but the virus and malware protection setup on the laptop was identical to what is on my main laptop, and I couldn't see how it could have been infected with anything - given the security blanket I had implemented.

I scoured all the forums and ran tests on the laptop every which way. Over an elapsed period of 4 or 5 weeks, I spent hours and hours investigating the problem, but to no avail, until I happened on a post on a forum where someone had documented the exact same problem, and he had discovered a fix for it in a web posting.
The causal problem was apparently a corrupted system file, in an area that you would not have intuitively expected to be associated with the problem. I still don't know how the corruption could have been caused though (incomplete root cause analysis).
Most people would probably have given up trying to figure it out and re-installed Windows, but I dislike that approach, and in any event saw no need to discombobulate myself with a re-install and all that it implies. I happen to prefer identifying the problem and its cause, and then fixing it. Anyhow, I eventually got there, but I cannot stop the problem from recurring because I still do not know how it might have happened in the first place. That's a result of an incomplete root cause analysis.
Actually, I might write a separate discussion about this, as it could potentially be tremendously useful to someone who finds themselves with the same problem in future.
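In the meantime, since the root cause is unknown and the corruption could recur, one pragmatic stopgap is to baseline a checksum of the suspect file and re-check it periodically. A hedged Python sketch - the file path shown is a placeholder, not the actual file that was involved:

```python
import hashlib
from pathlib import Path

def file_digest(path):
    # SHA-256 of the file contents, read in chunks so large files are fine.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_baseline(path, baseline_digest):
    # True if the file still matches the known-good baseline digest.
    return file_digest(path) == baseline_digest

# Usage (placeholder path - substitute the file that was found corrupted):
# good = file_digest(r"C:\Windows\System32\somefile.dll")
# ...later, e.g. from a scheduled task...
# if not matches_baseline(r"C:\Windows\System32\somefile.dll", good):
#     print("File changed - the corruption may have recurred")
```

This won't tell you *how* the file got corrupted, but it would at least catch the recurrence early, and the timestamp of the first failed check narrows down when to look in the event logs.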
4522
Living Room / Re: General av and anti-malware discussion
« Last post by IainB on November 02, 2012, 02:16 AM »
I don't find hijacks really that much of a thrill. It can take hours to clean up all the files and hooks from a PC that has been infected with a trojan/hijack.
The best way I have found of cleaning up a PC infected with a trojan/hijack is to use Malwarebytes.    :Thmbsup:
The best way I have found to avoid/reduce the risk of getting a PC infected by a trojan/hijack is to use both a virus checker and Malwarebytes PRO together. They are complementary.

The virus package I have used (since it came out for free) is Microsoft Security Essentials.    :Thmbsup:
4523
Living Room / Re: silly humor - post 'em here! [warning some NSFW and adult content]
« Last post by IainB on November 01, 2012, 02:26 PM »
Very droll. IE is a drooling idiot, or something.
4524
Living Room / Potentially unethical/unconstitutional alliances.
« Last post by IainB on October 31, 2012, 09:49 PM »
But of the industries that are leading directly to measures which "conveniently" overstep the stated mandate, it's Entertainment that keeps showing up, along with the War On Terror and Protect the Kiddies finishing the triumvirate. Gene patents are icky, but they don't seem to immediately lead to "we must examine blood samples daily to be sure you are not eating illegal corn."
Yes, but the explanation for that could be that there is a natural real/potential alliance over common ground between:
  • (a) The profit objectives of commercial interests (GCSes), and
  • (b) the expansion of State-control objectives of that much bigger "GSO" (Good Psychopathic Organisation), the State.
- thus, wherever such a natural real/potential alliance exists, you will probably see collaboration between, for example:
  • the State GSOs - e.g., EPA, Homeland Security, TSA, Judiciary, Police - and
  • other GSOs - e.g., PPA, GreenPeace, WWF, UN, WHO, IPCC - and
  • commercial GCSes - e.g., Big Oil; Big Tobacco; Big Media; Big Pharmacy; Big Food; Big Research (Monsanto and others); Big Internet/Marketing (Google).

These will necessarily/probably only be alliances of convenience, and you can bet that the collaboration will be obscured/hidden as best as possible (e.g., the UN's and IPCC's "impenetrably transparent" processes) and oiled by borderline or arguably corrupt/unethical practices, and typically motivated by revenue expectations in one form or another - e.g., carbon trading (a tax); royalty payments (another kind of tax); research funding (sharing of tax revenues); market share protection (a revenue guarantee); administration funding (sharing of tax revenues).

If you want to spot this happening, just apply Cadbury's "Ethical rule of thumb":
"The rule of thumb is that, if a business process can not stand the hard light of scrutiny, then there is probably something unethical about it". - Sir Adrian Cadbury (Chairman of the then Quaker family-owned Cadbury's) in his prize-winning article on Business Ethics for Harvard Business Review circa 1984.

This helps to explain why organisations put so much effort into delaying/rejecting FOI (Freedom Of Information) requests, with some GSOs even spending hundreds of thousands of what was originally taxpayers' money in stalling/defence tactics in the Courts. Avoidance/fear of discovery.

Putting your "own people" in as plants/political appointments and using a "revolving door" for appointments is all part of the game - stack the odds in your favour.
As an illustration, see Fed.Govt.+MPAA here, and coincidentally I read the other day that the UK's Labour party apparently have "placeholders" (Labour plants) on the Boards of almost all of the major charitable institutions in the UK. Now why would they do that?     :tellme:
"Because we care about charitable work."
Yeah, right.
4525
Some years ago, I was assigned to manage a project to develop a functional BCP (Business Continuity Plan) for a major Australian-owned bank based in New Zealand. The BCP had to be aligned with the stringent BCP standards mandated by the Australian parent. Being volcano-prone, earthquake-prone and tsunami-prone as an island on the "Pacific Rim of Fire", NZ already had/has a well-established MCD (Ministry of Civil Defence & Emergency Management).
We collaborated closely with the MCD in developing the BCP, and during the course of that collaboration I learned how well-prepared NZ MCD was for naturally-occurring disastrous events. The level of preparedness was very impressive. For example, for years now, things like mandatory building standards have taken into account the need for buildings to behave in a certain manner for maximum safety, in the event of an earthquake tremor. These standards are constantly being improved and lessons are being learned even now from investigations/reviews of the outcomes of past disastrous events - e.g., things like the recent Christchurch earthquake(s).
Things could always be improved, and will be, but I think that the current level of planning and preparedness is quite impressive anyway.

My curiosity was thus sparked by the two underlined comments by Roger Pielke Snr. (quoted in the thread above):
  • 1. The civil defense preparedness/response to H-Sandy was apparently effective:
    Also, with a storm of this magnitude, the National Hurricane Center, the National Center for Environmental Prediction, the media and public officials must be recognized and commended for their early warning. This has resulted in a much lower loss of life than would have otherwise occurred.

  • 2. The implication that policy for hurricane preparedness might not be sufficiently effective re land use planning.
    Regardless of how, or if, the risk from hurricane landfalls of this type increases in the future, a prudent policy path would be to reduce the risk from all plausible hurricane landfalls through more effective land use planning.

On the second point: I cannot imagine under what circumstances any civil defence responsibility could justify being negligent in not having adequate and effective land-use planning already in place. This when you have known for years (QED Pielke's book Hurricanes: Their nature and impacts on society) that hurricanes are going to come rolling inland off the sea with some kind of monotonous cyclical, clockwork regularity, and that land-use planning would make a significant difference to risk mitigation/avoidance. The mind boggles.

On the first point: H-Sandy seems to have been used as a political opportunity. What I have read from various blogs and news outlets is that FEMA was apparently an actual/potential component of the effective response, and yet FEMA appears to be being used as a political football - e.g., FEMA: Did Mitt Call for its Abolition? And Why Does Barack Want to Cut Its Funding?
I cannot conceive of a natural disaster being used as a political opportunity, or civil defence being so cynically used as a political football, in little old NZ. However, I can understand why it might be politically expedient to so use it in the cut-throat politics of the US - though it seems to show an apparently acceptable, cynical and callous disregard for and indifference to human endangerment and suffering, by the Executive, the Administration, and others.
Regardless, I gather that Obama has seemed to come over as "quite presidential" in his handling of the H-Sandy opportunity, whereas the same opportunity has left Romney looking a bit weak after his "Abolish FEMA".

One wonders what might have occurred had H-Sandy not so conveniently eventuated around election-time, and whether the politicians and their mouthpieces in the MSM would have encouraged anyone to give a damn about the human victims of the hurricane.
[/Rant]