
A software engineer might tell you that the fastest code is...


Deozaan:
Yep, that's it. The superscript wasn't in the quote! I didn't actually read the article because the quote seemed so wrong. :-[ :-[

(Though my calculator says 0.95^50 is about 7.7%, not 8%, but that's just getting nitpicky.) :-\

rkarman:
Think about this:  A system with 50 required parts, each one having 95% reliability, has an overall reliability of .95^50, or barely 8%!
-Paul Keith (June 22, 2010, 12:54 AM)
--- End quote ---

If I think about this, I come to the conclusion that it's not true... The internet is built with millions of components (if not more) that each have 99.98% reliability. According to this theory, the overall reliability would be something like .9998^1000000 ≈ 1.4 × 10^-87, i.e. a decimal point followed by 86 zeroes before the first significant digit.

Even if we consider the internet to be built of only 10,000 devices, each with 99.95% reliability, we still end up with an overall reliability of less than 1%.
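
Just to make the math concrete, here is a quick Python sketch (not from the article, just the series-reliability formula overall = r^n that it relies on) with the numbers from above plugged in:

--- Code: ---
# Series reliability: when all n parts are *required*, the system only
# works if every single part works, so overall reliability = r ** n.

def series_reliability(r: float, n: int) -> float:
    """Overall reliability of n required parts, each with reliability r."""
    return r ** n

print(series_reliability(0.95, 50))           # ~0.077   (the "barely 8%" from the quote)
print(series_reliability(0.9998, 1_000_000))  # ~1.4e-87 (the internet example above)
print(series_reliability(0.9995, 10_000))     # ~0.0067  (still under 1%)
--- End code ---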

I think the formula is somewhat flawed, or at least it clearly cannot "simply" be applied as generically as it is in this text. So the simplicity of the original article itself proves that too much simplicity also introduces errors.

The real lesson is ... Simple == Unreliable, Complex == Unreliable. Reliability is not just an accidental by-product of complexity; it needs to be designed and implemented. The author of "Lessons from Grand Central Terminal" clearly didn't have reliability in mind when designing & implementing his article.



mwb1100:
I think the key is the word 'required'.  That implies that if any one of the parts fails, the whole system fails.

The Internet was explicitly designed so that a failure in one (or even more) part(s) wouldn't result in the whole network failing.
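
Put in the same terms as the formula above: with redundant parts the failure probabilities multiply instead of the reliabilities, so every extra part makes the system more reliable rather than less. A rough Python sketch to illustrate the difference (my own comparison, not something from the article):

--- Code: ---
# Series vs. redundant (parallel) reliability.
# Series: all n parts are required, so the system works only if every part works.
# Parallel: any one of n redundant parts is enough, so it fails only if all of them fail.

def series_reliability(r: float, n: int) -> float:
    return r ** n

def parallel_reliability(r: float, n: int) -> float:
    return 1 - (1 - r) ** n

print(series_reliability(0.95, 50))   # ~0.077    : 50 required parts -> barely 8%
print(parallel_reliability(0.95, 3))  # ~0.999875 : only 3 redundant parts -> ~99.99%
--- End code ---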

But I think this still makes a point counter to the article: the article's premise is that complex legislation (or whatever) is almost by definition prone to failure due to its complexity. However, one could argue that a failure in one or more components of a piece of complex legislation doesn't necessarily render the entire thing a failure.

JavaJones:
In fact, most legal documents are written with boilerplate like "If any part of this agreement shall be deemed unenforceable, this shall not affect the enforceability of any other part", etc.

Still, uber-complex legislation is bad, mmmkay? :D

- Oshyan

rkarman:
Still, uber-complex legislation is bad, mmmkay?
-JavaJones (July 14, 2010, 07:28 PM)
--- End quote ---
I agree

The point of the "Lessons from Grand Central Terminal" article was that the complexity introduced in new laws is bad because it only creates more points of failure. Personally I think the bad part of complexity in legislation is that it hides the fundamental flaws in the laws (especially from the usually not-so-bright politicians).

Take the BP example of the article: "Even the government’s response to the tragically ongoing BP oil spill has been one of triangulation and determined-complexity. Get some supertankers to siphon off the leaking oil? Nope. Help Louisiana Gov. Jindal to build some temporary barrier islands along parts of the coastline?  No sir.  Keep a boot on the throat of BP — hey, that’s a killer sound bite!  Let’s go with that!"

The solutions proposed by the article are overly simplistic. They will work and do well for this specific case, but they will not prevent the next oil spill… The harm here is that the solution is still flawed, even though it looks solid.

Corporations are invented with the sole purpose of parasitizing the common wealth. If you design the law this way, you should not put out the accidental fire it causes and then think you have solved the structural problem. Corporations are by law not liable for the costs they impose on future generations and the public as a side effect of making a quick buck. No oil-drilling company has to provide replacement energy reserves for future generations, and no oil company pays for cleaning up future oil spills. This borrowing against the future is true for any kind of corporation.

BP was allowed to parasitize society. They are even required to do so by law!

The main problem with the article, again, was not analyzing the validity of the example used, and misusing it in such a way that it makes for a killer sound bite...



Anyway I was more interested in the software design side of this discussion, since I do believe complexity is a problem when it grows exponentially (which it does almost automatically if you are not very strict in maintaining architecture).

Take building a house as an analogy.
Doghouse (or small software): You take a few timbers and nails; throw them together and voila a doghouse.

Let’s size it up 10 times.

Normal House (medium-sized software): You take a few bricks and mortar; throw them together and voila a normal house. It might just work, although having it architected is usually a better solution.

Let’s size it up 10 times.

Cathedral (highly complex software): You take a few bricks and mortar; throw them together and voila a … uhmm ... pyramid?

Without design, a cathedral cannot support its size. The amount of material is too big and the whole thing will collapse under its own weight. The only way to get a lean structure with so much empty space in it is via architecture and design. Complexity is not the problem; complexity without architecture and design is.
