Recent Posts

6401
The most powerful tool that I ever came across for getting things done was the "ABC" method. This was long before the term "GTD" was coined, and was taught to me by a project manager (I was working in a Business Analyst role) in about 1977.
I have used the ABC method over many years and to good effect, and modified/augmented it only slightly. I have coached people in its use, and they have then been able to take control of their busy lives and quite literally transform them.

The ABC method of prioritisation for tasks:
  • A = Urgent AND Important
  • B = Important BUT NOT Urgent
  • C = Neither Urgent NOR Important
(There is arguably a logical 4th category: Urgent BUT NOT Important. However, this makes little sense as it is ambiguous, so it gets left out.)

The ABC method of prioritisation for user requirements in a system development:
A = Mandatory
B = Highly desirable
C = Nice-to-have

The ABC method for tasks focusses the mind wonderfully, and helps you to PLAN and to make decisions:
  • You only need to worry about dates for the "A's" - as they are urgent. Some "A's" might need to be done before others - dependencies.
  • You therefore address the "A's" first.
  • The "B's" can be picked up and worked on as and when you have some slack time whilst addressing the "A's".
  • The "C's" can be forgotten - because they are largely irrelevant (by definition) in the overall scheme of things.
  • Events outside of your control may cause the priorities to be up- or down-graded.
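The triage rule is mechanical enough to state as code. Here is a minimal Python sketch - the Task class and the sample tasks are my own invention, purely for illustration:

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    urgent: bool
    important: bool

def abc_priority(task):
    # The ABC rule as described above; urgent-but-not-important
    # (the ambiguous 4th category) simply falls through to "C".
    if task.urgent and task.important:
        return "A"  # Urgent AND Important - the only ones needing dates
    if task.important:
        return "B"  # Important BUT NOT Urgent - pick up in slack time
    return "C"      # Neither Urgent NOR Important - can be forgotten

tasks = [Task("Finish client report", urgent=True, important=True),
         Task("Read new standards doc", urgent=False, important=True),
         Task("Tidy bookmarks", urgent=False, important=False)]

for t in sorted(tasks, key=abc_priority):
    print(abc_priority(t), "-", t.name)

Sorting by the returned letter conveniently puts the "A's" first, which is exactly the order in which the method says to address them.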

The ABC method for requirements also focusses the mind wonderfully, and helps you to PLAN and to make decisions:
  • You MUST address the "A's" - all of them, and you need to identify any potential/actual interdependencies.
  • You negotiate with the users as to which "B's" are going to be included.
  • The "B's" usually take 2nd place to the "A's" in the queue for resource allocation.
  • The "C's" can be forgotten - because they are largely irrelevant (by definition) in the overall scheme of things.
  • Events outside of your control may cause the priorities to be up- or down-graded.

How I applied the method:
* Media: I tried using index cards, but it was too fiddly and so I moved to having 3 clear plastic folders - one for each of the 3 priorities.
In each folder I could have n sheets of paper forms. Each form was photocopied from a template. Each sheet had ruled lines down the page, with columns reading from left to right, as follows. You could only write on the form in pencil.
   *   Priority: value could be A or B or C, but you only needed to write the priority in if/when it changed from that of the sheet.
   *   Details/References: value is free text.
   *   Action: value is typically one or more of these:
                  Status flag: ToDo; WIP; Done; ARR (Awaiting Response/Reply);
                  Who: the initials of the person who was assigned to carry out this task;
                  Activity required: Do it; Call; Email; Meeting;
                  Date: due (usually for "A's" only); done.
                (You need to use pencil and eraser to update these.)

* Priority changes: Rarely - and usually as a result of a mistake somewhere - an "A" entry would be downgraded to "B", and similarly a "B" would be downgraded to a "C".
The usual change was a "B" entry being upgraded to an "A", as and when it became time-critical.
To effect the change, you transcribe that entry onto the appropriate other form and rub it off the line where it had been on the original form (where it no longer belonged).
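In software terms, the three folders are just three lists keyed by priority, and a priority change is a move between them. A rough Python sketch of those mechanics (the entry fields mirror the form columns above; the names are mine, purely illustrative):

folders = {"A": [], "B": [], "C": []}

def add_entry(priority, details, status="ToDo", who="", due=None):
    # One row of the paper form: Priority / Details / Action fields.
    entry = {"details": details, "status": status, "who": who, "due": due}
    folders[priority].append(entry)
    return entry

def change_priority(entry, old, new):
    folders[old].remove(entry)   # rub it off the original form
    folders[new].append(entry)   # transcribe it onto the other form

entry = add_entry("B", "Draft project budget", who="IB")
change_priority(entry, "B", "A")  # the usual case: a "B" becomes time-critical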

ABC computer-based method: In 1989/1990, I started using Lotus Agenda, a PIM which was ideal for automating the recording and dynamic updating of the ToDo list and details of any associated data. Lotus Agenda is obsolete now, and I have not found any software since that can perform as well, so it is back to the paper-based method. However, I was surprised to see that something similar was starting to be achieved with a Firefox/Chrome add-on called "GTD for Gmail" (still in beta), until they changed ("improved") it a week ago and apparently ruined the emerging potential. I think the developers probably failed to see/understand the significance or potential of what it was that they were building - you sometimes get a lot of that in IT.

Hope this helps or is of use.
6402
Update on progress with trial of Soluto:
The two screenshots (below) are the "Dashboard" (as I called it) and impart quite a lot of useful information - knowledge, actually. On either screen, you can zoom in and dynamically view details of the individual boot-up components involved. You can also decide to make changes here to the boot-up component queue - though, as yet, Soluto does not allow the user to force/set priority or sequential order of components for start-up.

I would normally prefer to see tables of numbers rather than a diagram, but I have to say that I could not so easily or rapidly obtain the knowledge that these dynamically interactive charts impart regarding the status and progress of my boot-up performance over time.
The graph shows that the earlier savings - which were lost after a huge Windows Office update - have been recovered. The blip is after reset and recovery from a BSOD event. I rarely get BSODs, and I would tend to attribute this event to the aforementioned updates (including several changes that I made to applications).

Incidentally, an automatically started Service process called PAStiSvc.exe had appeared after all the changes. It looked vaguely familiar. After googling it, I couldn't see that it was necessary for anything in my system, and the file date was 2005, so I set it to "Pause".
(By the way, the Soluto "Pause" means that the Service is set to "Manual" - as can be seen in the Services.msc panel.)
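If you want to confirm that from the command line rather than the Services panel, the built-in "sc qc" command reports a service's start type. A small Python sketch - note the service name here is my guess from the file name, so treat it as an assumption:

import subprocess

def start_type(service_name):
    # "sc qc" is the standard Windows service-query command; its
    # START_TYPE line reads e.g. "3   DEMAND_START" for Manual.
    out = subprocess.run(["sc", "qc", service_name],
                         capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "START_TYPE" in line:
            return line.strip()
    return "service not found"

print(start_type("PAStiSvc"))  # service name assumed from PAStiSvc.exe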

Verdict so far:
I initially thought that the Soluto beta was too simplistic, but now I am not so sure. Anyway, I shall try to keep an objective mind and keep the trial running, and shall report back to this forum for anyone who might be interested.
Attached screenshots: Soluto - component overview 2010 0613.png; Soluto - performance graph 2010 0613.png
6403
Screenshot Captor / Re: Screenshot Captor Splice Tool
« Last post by IainB on June 11, 2010, 01:43 PM »
@mouser: Thanks. That is a rather nifty addition to my fav screen capture tool.     :up:
6404
@jaden:
You say:
Soluto told me that Soluto added 7 seconds to the boot time, plus it was using around 40MB of RAM.

Yes. On my laptop (running XP Pro SP3 and with all MS updates):
  • Soluto tells me that it takes 11.9 sec. of boot-up time, and has a "disk load" of 28MB.
  • Process Explorer tells me Soluto has a "virtual size" of 198,732K.

Soluto recommends of itself:
Keep it in boot, as it improves the operation of your PC by giving you control over the applications launching in your boot.
- which I think is a reasonable recommendation.
By the way, over in the Soluto discussion forum, they say:
You should keep Soluto since it protects your OS startup.
Somehow unwanted software finds its way to your machine and loads automatically. This is why we run every boot.
Don't worry about resources Soluto takes on runtime. It moves into stand by mode if you do not touch the software.

Where you say:
I could see it being quite useful for folks who don't know much about what's starting automatically.

No. The point I was trying to make above was that:
Soluto at least told me something that I did not know before - i.e., all the precise boot-up time statistics, by component.

That is, I would seem to be better informed now about the performance of my running processes when using Soluto, than when not using Soluto. I was already reasonably well-informed about automatic starts and running processes by virtue of using tools such as, for example, Autoruns, Process Explorer, and the Windows Services control panel, but these tools did not give me any real idea of process startup performance times.

Soluto thus offered new data and presented it in a novel and very intuitive manner, to enable me to make decisions about:
  • "Pause" - i.e., remove from boot/startup.
  • "Delay" - i.e., start up after boot has completed.
  • "In Boot" - i.e., keep in boot/startup.
The net effect is that, as well as giving me new performance data (i.e., that I did not previously have) about process boot/startup performance and resource utilisation, Soluto probably saves me "tweaking time" that would otherwise be spent playing around with the other aforementioned tools whilst investigating running processes. Another thing that I find particularly useful is that once you have selected one of "Pause" or "Delay" or "In Boot" buttons for any given process, that process is then moved to the appropriate category in the dynamically interactive chart, where you can subsequently go and view it and select one of the other category buttons if you change your mind.
Whilst you are fiddling around like this, Soluto is keeping score of the last aggregate/total boot-up time (and its components) and what effect you will have on boot-up time with the changes you are making.

I would call this sort of control a dynamic "dashboard" control, and it is one of the most elegant and novel designs of a dashboard that I have come across. It is ergonomically quite well-designed - though I could ask for some improvements - and it is relatively idiot-proof. I would think it would be hard to beat for ease and simplicity of use, yet it's not a dumbed-down tool. That is, it still gives the user the flexibility and power to monitor, control and make decisions about optimising the performance of the full range of relatively complex boot/startup process operations.

Soluto is still in beta, but, because of the above points, I think it would bear watching to see how it develops.
6405
Find And Run Robot / Re: Latest FARR Release v2.107.04 beta - Sep 23, 2012
« Last post by IainB on June 10, 2010, 12:42 AM »
@mouser: In answer to your question:
...do you have any other plugins that use fscript.dll that work?

I presume all the plugins work (I don't know that any have stopped working). There just seem to be three that cause FARR to crash on startup.
Offhand, I don't know which of the plugins use fscript.dll, but here is a list of the folders in my Plugins folder:
  • Clipboard Help+Spell
  • Clock
  • ExternalSearch
  • FARRAltTab
  • FarrFox
  • FARRGoogleSuggest
  • FarrMilk
  • FarrMostRecentlyUsed
  • FarrUninstall
  • FCalc
  • Gcal_quicadd
  • Google_Translate
  • hamnotes
  • KlipKeeper
  • NWConnect
  • PrinterList
  • ProcessKill
  • ProcessTamer
  • ScreenshotCaptor
  • SendMessage
  • ServiceCtrl
  • Uninstalled Plugins
  • URLSnooper

The Uninstalled Plugins folder is where I have put the following plugins that cause FARR to crash on startup (they are in zipped folders):
  • Gmail Tasks
  • GooglePlus
  • TimeZone
6406
Find And Run Robot / Re: Latest FARR Release v2.107.04 beta - Sep 23, 2012
« Last post by IainB on June 09, 2010, 04:09 PM »
@mouser: OK, following on from my post above, I have just established that FARR crashes on startup if any one of these plugins is in the FARR directory:
  • Gmail Tasks
  • GooglePlus
  • TimeZone

So, I have removed them, and now FARR starts up fine.
I have no idea why these plugins should suddenly have started to cause offence.
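For anyone else who hits this, the isolation procedure is mechanical enough to script: move one suspect out at a time and restart FARR in between. A rough Python sketch - the paths and folder names are from my own setup, so adjust as needed:

import shutil
from pathlib import Path

PLUGINS = Path(r"C:\UTIL\Windows utilities\FindAndRunRobot\Plugins")
QUARANTINE = PLUGINS / "Uninstalled Plugins"
SUSPECTS = ["Gmail Tasks", "GooglePlus", "TimeZone"]

QUARANTINE.mkdir(exist_ok=True)
for name in SUSPECTS:
    src = PLUGINS / name
    if src.exists():
        # Move one plugin out, then restart FARR to see if it still crashes.
        shutil.move(str(src), str(QUARANTINE / name))
        input(f"Moved '{name}' out - restart FARR, test, then press Enter...")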
6407
@hpearce: What tool did you use to measure your component boot-up times? I had not previously been able to do that - so Soluto at least told me something that I did not know before - i.e., all the precise boot-up time statistics, by component.
6408
I have updated my post above with some more info that you might not have seen yet.
6409
@40hz: Yes, PSI is really rather good, isn't it? The potential stability/performance issues were in my mind too. I had considered leaving PSI out of my startup components, but as it contributes only 5 seconds of my boot-up time and does not seem to be an annoying CPU or disk I/O overhead on my laptop (Intel Core Duo 1.8GHz), I have left it in boot for the meantime. I may yet prune it out and only run it intermittently on demand - or maybe schedule it as a periodic job.
6410
I have been trialling some new PC performance monitoring software that I stumbled across - Soluto (currently in beta).
It's free at present, and may stay that way.

But wow! I am truly impressed by Soluto. Nice job.    :up:    :up:    :up:
As a one-time software programmer and developer, I am always on the lookout for new/specialised software, and enjoy trialling/testing it. I have become skeptical about it though, as the majority is kinda mediocre.
I started using Soluto on 2010-06-07. On my first re-boot, the savings on boot-up were a little over 2 minutes off my 6-minute boot-up time. After some more tweaking, and then a huge Windows Update today, the time has gone back up to 6:03 minutes, so there's still more tweaking required, methinks.

Soluto lets you make decisions as to whether to postpone startup of components until after boot ("Delay"), temporarily suspend them from startup ("Pause"), or leave them "In Boot". "No-brainers" are flagged as recommendations for your special attention. The concept is rather clever, using crowdsourced feedback for data on the components and providing a rather nifty dynamically interactive timeline which shows the system changes you made before and after using Soluto, so you can see how your boot-up time has varied over time and what the possible causes of variation were. There is also a rather nifty dynamically interactive component chart, with components arranged in startup time order.

Because of this relatively scientific approach, there's no more guesswork in identifying the boot-up criminals, and because Soluto times your boot-up by component I now know that (for example):
  • ZoneAlarm's True Vector Internet Monitor takes a whopping 46 seconds in boot-up.
  • Windows' System takes 62 seconds.
  • MS Anti-malware takes 53 seconds.
  • Clipboard Help and Spell takes 9 seconds.
  • FARR takes 7 seconds.
I am not sure whether I shall keep Soluto longer term - it depends on how useful it is after a longer period of use - but the omens for keeping it are pretty positive so far.

Hoping this might be of use/help to somebody.
6411
I am usually scouting round for new PC security and performance monitoring software, and trialling it out.
I stumbled on a blog post by someone who claimed to put a great deal of faith in Secunia Personal Software Inspector (PSI). I had never heard of it, though I found it referred to in some old discussions on the DC forum.

PSI is free to download for personal/non-commercial users from the secunia.com website: Secunia Personal Software Inspector (PSI)
So, I downloaded PSI and have had it running for a few weeks now. It does what it says it will do, and is rather impressive - seems to be quite rigorous.    :up:   :up:   :up:
In particular, I found these useful:
  • The identification of "Insecure" software on your PC - i.e., software versions for which there are known security updates available.
  • The identification of "End-of-Life" software on your PC - i.e., software versions which the vendor no longer supports.
  • A percentage rating of your PC software security status, mapped over time in a little histogram, so you can see whether things have improved after you have made some of the recommended changes to your PC's risky software.

PSI offers solutions for the risky software, taking you to appropriate links for the software suppliers' updates (where there are any).

Hoping this might be of use/help to somebody.
6412
General Software Discussion / Re: Software for Business Process Modeling?
« Last post by IainB on June 08, 2010, 10:47 AM »
@superboyac:
Here's the link to the CA Allfusion CASE toolset.
AllFusion Process Modeler
6413
Find And Run Robot / Re: Latest FARR Release v2.107.04 beta - Sep 23, 2012
« Last post by IainB on June 05, 2010, 12:36 PM »
Well, I'm not sure what's happening now.
Having just rebooted my laptop, I get this error:
---------------------------
Find and Run Robot
---------------------------
ATTENTION: Because it appeared to crash on the last run, the following plugin has been disabled:
C:\UTIL\Windows utilities\FindAndRunRobot\Plugins\Gmail Tasks\FScript.dll

NOTE: The plugin will attempt to be loaded on the next run of the program,
 so to prevent this from happening in the future, please remove or
 update this plugin and report the problem on the www.DonationCoder.com forum.
---------------------------
OK   
---------------------------

Then I get:
---------------------------
Microsoft Visual C++ Runtime Library
---------------------------
Runtime Error!

Program: ...IL\Windows utilities\FindAndRunRobot\FindAndRunRobot.exe

This application has requested the Runtime to terminate it in an unusual way.
Please contact the application's support team for more information.
---------------------------
OK   
---------------------------
FARR then crashes.
So I removed Gmail Tasks from the equation, and the next time round another plugin was found to be faulty.
And then FARR just crashed again (same pattern) every time I restarted it.
When I restored Gmail Tasks, it was the guilty party again.
Help! I have become addicted to using FARR and now I can't get my fix!
6414
General Software Discussion / Re: Software for Business Process Modeling?
« Last post by IainB on June 04, 2010, 04:03 AM »
@superboyac: In response to your post:
By the way, you can get hold of the abovementioned GAO manual here (click on link): GAO BPR Assessment Guide 1997

Defining a BP (Business Process): When you draw a diagram of a BP - typically into "swim lanes" - and (maybe) put words around it to describe the BP, so that it can all go into a hardcopy or online Intranet instruction manual for process participants (the people who actually operate the process) to refer to, then you are defining the BP.

Why would you define your BPs? Well, you are apparently wanting to do this to provide a BP instruction manual for your employer, where you say:
This is kind of a big deal for me and the organization.
If it is "a big deal", then you presumably want to make as good and professional a job of things as possible.
What you have to do is operate a process for BP discovery, analysis, modelling and documentation. You might also want to add process improvement into that mix, because you are likely to stumble over potential areas for significant process improvement once you start to scrutinise a process.

You can either:
  • (a) Carry out that process in the conventional manner - i.e., based on little or no theory and conforming to received wisdom: manual and relatively laborious, time-consuming, labour-intensive and thus expensive. (Most companies will take this track, because they don't know any different.)
  • (b) Carry out that process using a sound theoretical approach, automating the process as much as possible. Automating and accelerating the discovery stage - of the process of business process analysis and re-engineering and the communication of same - will result in these benefits: you are able to do more with fewer analysts, in roughly half the time it would normally be expected to take, and at lower cost. This is what my Oracle vendor client (see previous post) did for their ITIL process work.

You start with this premise: Anything that an organisation does is a process (a BP), by definition.
"If you can't describe your processes, then you don't know what you are doing." - W.E.Deming.

Procedure or process manuals have been around since the Industrial Revolution started, and thousands of man-hours ($$$ costs) have been and are spent on producing such documents and maintaining them. They have to be maintained if they are to keep up with the reality - which is that BPs have a tendency to undergo change/improvement in real life, regardless of the static definition of the process in a manual. Without this maintenance effort, the majority of procedure/process documents tend to gather dust and rapidly become out-of-date. In some industries, it is mandatory under law to keep process and equipment documentation up-to-date - for example, in the power supply industry where the unwitting use of outdated manuals for isolating and servicing high-voltage transformers has resulted in engineers' deaths by electrocution.
As a general rule, when in doubt, automate. If you can automate the process of BP definition and communication and automate the production of BP manuals and changes to same, to online Intranet sites, then you are likely to be able to avoid spending a heap of money on that work and on hardcopy materials and administration and maintenance of all/any forms of that documentation.

Theory: regarding the need for theory:
"Action that is not based on sound theory is irrational, by definition." (W Edwards Deming; and refer also Kepner-Tregoe)
There is a tremendous amount of ignorance, opinion and BS around the subjects of BPM (BP Modelling) and BPR (BP Re-engineering), and I have had to drag myself out of my own ignorance in that regard (still doing it).
There are three thinking tools that are probably the most useful I can suggest to help gain an in-depth understanding of BPs, what to do about them, and when:
You can use the CMM to determine what CMM Level different parts of your organisation are at.
From experience, many organisations rest at CMM Level 1 ("Ad hoc") or CMM Level 2 ("Repeatable") - both are kinda chaotic. Such organisations can and do waste a lot of money and produce a lot of poor or inconsistent quality output because of the "process thrashing" that typically occurs in those Levels (where BPs are in a state of dynamic change).

More theory: general management theory defines these four basic types of management:
  • Strategic management: Planning "the way ahead". Taking the helm - finding out what direction to take and steering things in that direction.
  • Operational Management: Planning for and managing and maintaining the stability of operational processes and the status quo of same, so that things tick along with minimal change/accident.
  • Project Management: Planning for and managing the life-cycle of projects to "do" something new (e.g., make a change to some process; implement a new computer system).
  • Crisis Management: This is about reactively dealing with events that flare up seemingly unexpectedly, requiring heroic efforts to sort out. No planning is required. This is something that seems to come very naturally to us as a species (we naturally lack the ability to plan and have to learn to do it). Some managers are very good at crisis management and may even opine that things are best left untouched until they become a crisis and you are obliged to do something about them. Large companies with a CEO who prides himself on being a good crisis manager have been known to fail spectacularly. Many crisis managers can seem to be quite charismatic and are highly-regarded until The Collapse. Surprisingly, these people are allowed to vote and drive cars.        ;)

From experience, to work on process analysis/discovery and modelling, you ideally need to have the sponsorship of an Operations Manager. The other types are unlikely to be able to make time for, or sense of, what needs to be done. The Ops Manager, though, will rapidly be able to see the importance of the work to their management responsibility, and that it could help them in the execution of the role.
Even with that support, if the organisation is at CMM Level 1 or 2 or somewhere in-between, and if you try to define a process, then good luck - the outcome will be uncertain. You will define it one day, and the next day it will have changed, so your definition will be "wrong" now. Such organisations are not in an appropriate state of readiness for deliberate, developmental change/improvement. So, you have been warned: if you hook yourself to the end of a flailing piece of rope, it'll be a bouncy ride and probably totally unproductive in the general scheme of things. Having tried to ride that rope and beat it against all odds, I have learned to walk away from work where the client is too chaotic to benefit from BP modelling/definition. However, if management understand the theory and the need to improve the capability maturity of core processes, then you may well get somewhere with it all, but you won't be able to do it on your own (that's why clients call me in as a consultant).

Drawing a diagram of a process can be the start of process definition. You can draw it as though it were on paper, using a diagramming/drawing tool. However, if you were able to draw that diagram in a CASE (Computer-Aided Systems Engineering) tool that captured your lines, logic and boxes and turned them into records and record attributes in a database (repository) of your corporate processes, then you would be in a novel position. You could plot the process by department (swim lanes) or cost centre (swim lanes) or by any other attribute you can think of. You can also achieve a lot of that stuff that I referred to in my earlier post. If an organisation gets to this stage, then it will be more likely to be able to move towards CMM Level 3, at least.

However, if you use some kind of system design tool - e.g., a BPMN tool such as Aris (the Oracle tool uses that) - then I would suggest great caution. From experience, the BPMN standard is appealing to IT people (QED). It is also very expensive. Many IT system architect groups have bought this hugely expensive software because they like it (though they may have all sorts of rationalisations for buying it). Having bought it, they then find that they have little real use for it, and the thing languishes in a corner somewhere. They treat it like a hammer and go looking for nails. Every time someone wants to draw a diagram of anything, and especially when someone says "business process", out comes the old hammer. They have to make it look like it was worth spending all that money on - see? They will insist that there is no better hammer than this one - take their word for it! They know they are right, and they know better than you! It's rather like an ideological zealot. They Believe and so must you!
Belief is irrational, by definition, and Edward de Bono described this sort of situation as irrational intellectual deadlock that often affects the more intelligent (it relates to ego, which has to be "right" at all costs). I reckon it is also Ahamkara, though it has been aptly analysed in the theory of psychology as cognitive dissonance (refer to the book "When Prophecy Fails").

If you go back to Deming's quote about theory:
"Action that is not based on sound theory is irrational, by definition."
- and if you look at the theory behind IDEF0/IDEF3, then you will be amazed at how simple and applicable it is to business processes (as opposed to IT systems), and you could not fail to notice that the IDEF0/3 standards enable a rigorous focus on the process. That is presumably why the US DoD had it developed, and why GAO and government agencies mandated its use.
In IDEF0/3, IT systems only figure as a "Mechanism" to enable (support) a process. Thus IT is a dog and is to be kept in its kennel. This would be anathema to an IT Believer who Believed that the system describes the process. This is pure belief/opinion, and nothing could be further from the rational truth (refer Deming). This is also why the Rational Use Case approach is next-to-useless for unambiguously analysing and communicating BPs, despite the IT ideology that Believers insist makes it quite the reverse.

This is all about business processes, not about IT (zero IT). Let the business managers make the decisions, not the IT guys - even in an IT company (again, like my Oracle client). Thus, when you say:
There are so many technical terms, and strange, complicated lingo for all of this.  What does this have to do with IT?
- It has nothing to do with IT. I would suggest that you don't let the IT guys hijack the BP analysis phase and turn it into a non-productive exercise in ITIM (IT Intellectual M#st#rb#tion). They can do that, you know. I once saw a large software development group brought to its knees by intransigent internal bickering between two opposing IT ideological camps over whose methodology was thickest. It stopped a major redevelopment project (the biggest such project in NZ banking) dead in its tracks whilst the debate raged. Eventually the owners shut the company down and made them all redundant. No satisfaction in your Belief being "right" there, on either side.
 

Closing with a couple of Deming's favourite quotes:
* Who is it that darkeneth counsel by words without knowledge? Job 38:2
* My people are destroyed for lack of knowledge. Hosea 4:6
6415
General Software Discussion / Re: Software for Business Process Modeling?
« Last post by IainB on June 03, 2010, 12:34 AM »
@superboyac: What exactly are the requirements and outcomes that you have when you say:
"What I need is something that can do Business Process Modeling."
For example - swim-lane diagrams (which you have already identified).
I ask this because your requirements will generally tend to determine what is the most useful tool for you.

(Sorry for this rather long post, but I thought it could be useful.)

The last time I researched this for a client, to answer a similar question, several points came to the surface:
  • There were approximately 75 software packages on the market professing to "do BPM".
  • Drawing/diagramming tools: the majority of these packages (about 70) were diagramming tools. These are essentially designed for drawing charts onto paper-based output, and some are identified in the discussion above - EDGE Diagrammer, SmartDraw, iGrafx, MS Visio, MS Word and PowerPoint (yes, those too are legitimate diagramming tools!). They are all pretty good at what they do.
  • There are several different "methodologies" that you find in use:
  • * Process symbol drawing standards: the classic "Taylor" diagramming standards (unchanged from the Industrial Revolution) - archaic, but still in use today and perfectly valid for process diagramming purposes. These diagrams are unambiguous and easy to communicate to users.
  • * Process system logic drawing standards: the main one is Rational Use Case. These diagrams are great for system design, but near-impossible to communicate unambiguously to users.
  • * BPMN (Business Process Modelling Notation): This is a relatively new standard having emerged over the last 6 years or so, and seems to have been developed by a consortium of software suppliers interested in devising a new standard and cornering the market for that standard. The software using BPMN enables logical modelling and diagramming, but tends to be very costly - Aris, Sparx, Oracle BPA Suite (is Aris), Telelogic System Architect. BPMN diagrams are relatively unambiguous and easy to communicate to users. Costs are typically about US$30K+ to distribute this category of software via a server in an enterprise.
  • * IDEF0/IDEF3: these two IDEF standards have been around for years and have stood the test of time. They are very simple and logical. Diagrams using these standards are very easy for users to comprehend and critique. These standards were originally developed for US Dept of Defense use, and the software to build models using these standards was funded by DoD. The FIPS for these standards are dated about 1991 (from memory). IDEF0/IDEF3 are two of a set of about 15 standards, and are the ones that relate to BP modelling and decomposing such models into DFDs (Data Flow Diagrams). IDEF was de rigueur for all US government department process modelling, and has been used by government and commercial enterprises in the US, Canada, Europe, Australasia, Vietnam and Thailand. There is a text or guide book issued by the US GAO that describes when/how to use it for BP analysis, improvement, and rationalisation. The software is now owned by CA and is called CA Allfusion BPM. It is part of a CASE tool suite where you can, for example, feed BPM models into an E-R modelling tool, and feed E-R models into a BPM model - it is very powerful and time-saving. This is proper BP modelling, where models exist in a logically coherent database, and you never need to print out your models on paper. You project them onto a wall and discuss them with users, dynamically changing them as you discuss them. When finished, documentation can be automatically generated (including diagrams) in HTML - so it can all be on an Intranet for people across the enterprise to access. (No paper required.)

In 1998 I found myself in a bit of a spot as the lead BP engineering consultant when we bid for and won a contract in Thailand, where we had to collect and re-engineer all the core processes of a Thai government Department, and draw up specs for systems to support these processes, for a 7-year plan to automate what were then largely manually-operated processes. We were going to do it the conventional way (paper-based) as we had been taught in school. However, when we got there, we discovered that another - rather famous - consultancy had preceded us, offering to do much the same thing, and had been thrown out and not paid - their work had not been perceived to add any value.

Fortunately, one of our team was a data analyst who was using an E-R modelling tool called ERwin, and he said it was part of a CASE suite of tools that included a BP modelling component (BPwin), which could enable capture of the processes in a logically coherent database. It used standards called IDEF0/IDEF3 that I had never heard of before. I quickly obtained a trial copy of BPwin. The IDEF0/IDEF3 standards were brilliantly simple, and the tool was not difficult to learn to use (I taught myself, using the handbook). Over a long weekend (4 days, with little sleep) I succeeded in developing over 100 AS-IS process model diagrams covering the core processes of the client organisation - all drawn from our consultants' extensive field notes (discovery). I was amazed at having been able to achieve this and the client was blown away by it (so was I!).

We then taught the client process improvement group personnel to use the tool and collaborate with us in improving the models and re-engineering them into TO-BE models. These formed the basis for future system development. The models were held in a database where different modellers could be assigned to work on different parts of the models, so they would be "checked out" of the database whilst someone was working on them, and checked back in again when finished and ratified for integrity by the process participants and owners.

What this Thailand experience taught me and the other consultants is:
  • that we needed to raise the process of business process modelling to CMM Level 3 or higher, and that we could do this if we automated the process of BP modelling to a large extent.
  • that we needed to be constantly aware of emerging standards and technologies and how they could be employed to help us improve the quality and speed of delivery of BP consulting assignments.
  • that the use of CASE tools such as BPwin enables improved productivity, so that you can expect to save 50% on the timescale for a typical BP analysis and save 50% in people (you need fewer BP analysts), with correspondingly very significant cost savings. (This has also been demonstrated and achieved in several similar assignments subsequently.)
  • that a lot of good and useful thinking is done via DoD-sponsored methodologies and tools (e.g., PERT, CMM, IDEF0-14, BPwin).

By the way, the BPwin software was ultimately acquired by CA and is now the BP modelling component in CA Allfusion Suite.

After the Thailand experience, I made a firm promise to "never again go back to using the old process diagramming/modelling approaches", and I have assiduously tried out all the new/changed BP modelling tools. I am not really interested in and have little use for process drawing/diagramming tools - they are all fine. Most of them will be able to "support" BPMN and IDEF0/IDEF3 - which only means that they can draw the boxes, lines and other symbols.

Amongst the pukka modelling tools, the BPMN standards tools - though very expensive - are very good, and, because they lend themselves more to IT systems design (which is why IT people will tend to prefer them and find them easier to understand), they are quite popular with IT system architects, who sometimes even seem to develop a religio-ideological fervour about them.
However, for pure business process modelling (not system design) the BPMN tools must come a distant second to the IDEF0/IDEF3 process modelling standards tools (there is only one really - CA Allfusion BP) - when what you want is absolutely clear and unambiguous definition of business processes. This tool includes ABC (Activity-Based Costing) which is a real boon to evaluating the cost-effectiveness of your AS-IS or TO-BE process whilst you are modelling it - after all, it's a business decision as opposed to a technical decision as to how to operate any business process, so the activity costs will be important. This tool is relatively low cost (about US$3K per copy, and you won't need many copies) and would be indispensable to any organisation trying to move up the CMM continuum to Level 3 and above.

I will generally use whatever standards my clients want me to use on an assignment. When I was contracted to do a business process re-engineering exercise for a major Oracle vendor in 2009, I was told they wanted to use BPMN and the Oracle BPA Suite (this is the Aris software, but with an Oracle badge). They had not used either before. When they started to enable their license to use it, they realised that it was still hugely expensive and that they could not afford it - even given the dealer discount. So they decided to use a non-BPMN process diagramming and documentation tool that was owned by one of their subsidiaries - with the same result (still hugely expensive, and they could not afford it - even given the dealer discount). Then they asked my advice. I did a quick survey of the market and gave them the $ numbers and a recommendation - it was a business decision. Ironically, the client selected and bought CA Allfusion BP - because: it was cheapest; it met all their needs (they changed their standard from BPMN to IDEF0/IDEF3); and they got ABC into the bargain (which gave unexpected benefits).
6416
Best Archive Tool / Re: Versions??
« Last post by IainB on May 29, 2010, 05:13 PM »
@JavaJones:
Thanks for the Wikipedia link. I had not seen that before.
"Fractal compression" was withdrawn from the market because it actually didn't provide as significant a benefit over modern JPG compression as was hoped.
Where did you get that from? I'm not sure that it was "withdrawn", or at least, not for the reason given.

Digression re FC (Fractal Compression):
In the early '90s I was a product manager at a company that had bought into a dealership agreement with Iterated Systems for their fractal compression technology, which consisted of (from memory):
  • A circuit board which you plugged into your PC bus - this had the necessary and patented compression algorithms in firmware and/or in the drivers and associated software for using the hardware. This was a "black box" and the only means of compressing the images.
  • Software to enable viewing of the compressed images - this had the necessary and patented decompression algorithms in the software, to enable on-screen viewing of compressed images. This was the only means of viewing the images.
Because the technology was thus constrained, and because both the compression hardware and the viewing software were very expensive, the use of the technology as a solution tended to be limited to large organisations with relevant application areas which could cough up the relatively steep prices. That meant, in this case, police (who have a huge database of mugshots) and newspapers (reduced transmission and storage resources required for moving news photos around). However, the police database at the time was mostly B&W, and FC offered minimal compression of those, so no takers; and the news organisations couldn't really cost-justify it, so no takers there either.

If you look at the Wikipedia article, it says "Patent restrictions have hindered widespread adoption of fractal compression." That would probably be a gross understatement - it more likely died because of a bad marketing model. The technology seemed to be locked up with patents, constipated proprietary technology, exorbitantly priced dealer licences and high product prices - all of which probably explains why we never sold any of that technology (as far as I can recall) and why I recommended we did not renew the costly dealership agreement we held with Iterated Systems. I was sorry about that though, because the technology seemed very good - it did offer significant compression of .JPG (i.e., already compressed) images. If things had been done differently, I often wonder whether FC might not have become as ubiquitous as, and possibly superseded, JPEG.
6417
@Jibz:
Unfortunately that article is terribly precise.
Hahaha, yes, very droll - "terribly precise".        :D
6418
I have been wondering why I get a sense of déjà vu about this discussion. So I sat and thought about it for a while.
During the thinking part, it dawned on me: the discussion in this thread and all of the things referred to are rather like the discussion between the two characters (Vladimir and Estragon) in the play "Waiting for Godot". At one point they discuss what Godot (who, though he is the central subject of most of their discussion, never actually appears) is going to do for them when he arrives. Vladimir, who is usually pretty sharp, struggles to remember and then says, "Oh ... nothing very definite."
When we try to establish what Wave is and what it is good for, we arrive at "nothing very definite", yet we are apparently allowing this indefinite/undefined THING to occupy a good deal of our cognitive surplus. Why is that?

Indeed, in the vid clip above, Lars Rasmussen (Manager, Google Wave) starts off by saying how "...today Google Wave is a product that people are using to get productive work done all over the world...we are going to show you that today with a 90-minute video...I'm just kidding, we're not going to do that." In fact, Lars leaves us none the wiser as to exactly what concrete use Wave is for anything at all, though it seems that "everybody's interested in using it"  - or words to that effect. That looks like an appeal to the consensus - a logical fallacy that proves nothing.

Like I said in an earlier post in this discussion, Wave seems to be a really neat solution looking for a problem - it's a Thneed.
I suppose it could also be a modern version of the emperor's new clothes. Because those terribly clever people over at Google have invented something that neither they nor us seem able to define, we are perhaps reluctant to admit - for fear of seeming stupid - that we can't actually see any value in it. So we don't state the obvious, and instead settle for obligingly discussing around the nebulosity of the thing, in the hope that perhaps it will become clearer to us and then perhaps our limited intelligences will be able to comprehend.    :D
6419
FARR Plugins and Aliases / Re: An Extremely Simple Gmail Tasks Plugin
« Last post by IainB on May 28, 2010, 10:20 AM »
@rulfzid: Many thanks for this plugin. I have just started using it. Neat idea! Just what I had been looking for.   :Thmbsup:
6420
Best Archive Tool / Re: Versions??
« Last post by IainB on May 28, 2010, 09:09 AM »
@Target: Thanks for the heads-up on the .bmp compression benefit. I had forgotten that some image files will compress better than others. It might be worthwhile testing all the common image file formats to see a comparative scale of compression (using the same image saved into different formats).

Out of interest I just compressed:
  • a 3.4MB .bmp file into ZIP format, and it became 2.5MB. That's a 0.9MB saving, or 26.5% compression, which is significant - though not huge (like in a document file) and might not be worth the effort.
  • a 1.2MB .jpg file into ZIP format, and it became 1.2MB. (So no saving there.)
  • a 1.9MB .png file into ZIP format, and it became 1.9MB. (So no saving there.)
  • a 3.7MB .tif file into ZIP format, and it became 2.2MB. That's a 1.5MB saving, or 40.5% compression, which is significant - and well worth the effort.
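If anyone wants to repeat the comparison on their own files, ZIP's DEFLATE compression is available directly from Python's standard library. A quick sketch (the file names are just placeholders):

import os, zlib

def report(path):
    data = open(path, "rb").read()
    packed = zlib.compress(data, 6)  # DEFLATE, the same algorithm ZIP uses
    saved = 100.0 * (1 - len(packed) / len(data))
    print(f"{path}: {len(data)/1e6:.1f}MB -> {len(packed)/1e6:.1f}MB "
          f"({saved:.1f}% saved)")

for f in ["photo.bmp", "photo.jpg", "photo.png", "photo.tif"]:
    if os.path.exists(f):
        report(f)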

adi_barb was using .png files, according to his post. If he has a lot of TIF files, ZIP compression might well be worth looking at.

As for Total Commander, I had trialled that extensively - a few years back now, I guess. I recall it was very good - one of the best - but, if the version I trialled was able to manipulate/edit compressed files as though they were folders, then I guess it did not come up to spec., otherwise I would have stuck with it. It sounds as though it would meet spec. now though. What is TC like when manipulating files in a ZIP archive? Is it slow and disk-intensive - like I recalled these sorts of proggies were?

Meanwhile I am stuck with a HUGE (and growing) collection of mostly .jpg files, and I would love to find something to compress them by 40% or more. It's a pity that the fractal compression technology seems to have been withdrawn from the market several years ago.
6421
Best Archive Tool / Re: Versions??
« Last post by IainB on May 27, 2010, 09:41 PM »
@adi_barb: iZArc puts the compressed files in whatever location is specified in the settings (Check Options|Folders tab). It sounds as though you had the same experience I had when first using iZArc - "Huh? Where'd my file go?"

Compressing image files:
  • If you are putting image files into ZIP or any other compressed file and think this is compressing them, then you might be surprised to know that no significant compression takes place. This is because there is no "air space" to compress in image files. The only compression system I know of that achieved significant compression on image files was fractal compression, which seems to be no longer commercially available. Though it was a "lossy" compression system, fractal compression could compress individual image files by astonishing amounts - e.g., 60% for a colour .JPG image (and JPG is a compressed format in the first place) - and with no visible loss of picture quality. Fractal compression worked best on colour images, with a reduced level of compression for B&W images.
  • The only time when putting images into a ZIP file might be useful is when you want to move or copy them all as a cohesive collection. Just one file move as opposed to many file moves. It's a bit faster to move one file than many.

On document files though, for example, you could typically achieve 60% to 80% compression using common compression methods.

If you want to work on (e.g., edit) any file (image or document) that is in a .ZIP file, this is extremely tedious. I think the operating system has to copy ALL the files in the compressed file out to a Temp area, edit the ONE you want, and then save them ALL back to a new compressed file. Lots of unproductive I/O and disk thrashing there. A few years ago I trialled some systems that promised to let you do this on .ZIP files containing many files, just as if they were an ordinary system folder, but they proved to be slow and highly unreliable. They crashed, and lost or corrupted your data. My recommendation would be to steer clear of them.
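You can see why from how a ZIP archive has to be manipulated. With Python's standard zipfile module, for instance, "editing" one member really means rewriting every member into a fresh archive - a sketch, with placeholder names:

import os, zipfile

def replace_member(archive, member, new_bytes):
    tmp = archive + ".tmp"
    with zipfile.ZipFile(archive) as zin, \
         zipfile.ZipFile(tmp, "w", zipfile.ZIP_DEFLATED) as zout:
        for item in zin.infolist():
            if item.filename == member:
                zout.writestr(member, new_bytes)  # the one edited file
            else:
                # every other member gets read and re-written wholesale
                zout.writestr(item, zin.read(item.filename))
    os.replace(tmp, archive)

replace_member("photos.zip", "notes.txt", b"updated contents")

All that re-reading and re-writing of untouched members is the unproductive I/O and disk thrashing referred to above.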

Hope this helps or is of use.
6422
@JavaJones:
I'll keep watching Wave and if anyone I work with has any enthusiasm for experimenting with it I'll gladly participate. But I don't have any such enthusiasm personally at this point.
What you said is pretty much the conclusion I had arrived at.

It's as though Wave is a really neat solution looking for a problem, or like the Onceler's "Thneed" in Dr Seuss' "The Lorax": "It's a scarf, it's a hat, it's a bicycle seat cover. A Thneed's a fine-something that all people need!" (or words to that effect).
6423
Living Room / Re: Free ALZip 8.0 License
« Last post by IainB on May 20, 2010, 12:39 PM »
@Renegade: Thanks for the tip.    :Thmbsup:
I have just registered for a Free ALZip License.

I have been a long-term user of an old version of WinRAR - which is very good.
After a recent discussion in the DC Forum, I started trying out IZArc - which is very good at what it does - but I am not entirely happy with it.
A look at ALZip might help me to clarify my requirements.
6424
Living Room / Re: What annoys you to no end?
« Last post by IainB on May 19, 2010, 09:51 AM »
@40hz: Yes, thanks for the reading/links.      :up:
6425
Site/Forum Features / Re: Friendly 404
« Last post by IainB on May 18, 2010, 08:34 PM »
@mouser:
Here is my entry, using Nudone's image of cody from our last NANY event:
https://www.donation...r.com/About/404.html
I think that is a great 404 design.     :Thmbsup:
+1 vote from me.

Out of interest, take a look at this for an amusing 404 from Snopes.com
http://snopes.com/Peter%20Cosgrove