
Scientists Claim They’ve Built a Computer That Never Crashes


kyrathaba:
http://gizmodo.com/5984493/scientists-claim-theyve-built-a-computer-that-never-crashes

TaoPhoenix:
You know this is ITCHING for hacking.  ;D

Quote: "Just don't expect it on your desktop any time soon."
--- End quote ---

What T.H. good is that if we can't have it?

f0dder:
Sounds... convoluted. And "crashes" are one thing; your standard run-of-the-mill logic bugs are another - nothing can really do anything about those.

The executive overview doesn't really give much info, anyway. But it sounds like something that's going to be hard to make reach the speeds we're currently seeing - lots of duplicated units (that might sit idle when not selected by the RNG, and thus waste expensive silicon?), how is the data transferred back and forth, etc.
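If I had to guess at the mechanics, it's something like N redundant copies of each computation, a PRNG deciding which copies fire each step, and a vote to mask a dead one. A toy sketch of that guess (the names and the voting scheme are my invention, definitely not UCL's actual design):

--- Code: ---
import random

# Toy sketch of redundancy plus random scheduling: several identical
# "units" each hold a copy of the same computation, a PRNG picks which
# ones run each step, and a majority vote masks a unit that has gone
# bad. Pure guesswork at the article's idea, not the real architecture.

def systemic_step(units, value):
    chosen = random.sample(units, k=3)            # the rest sit idle
    results = [unit(value) for unit in chosen]
    return max(set(results), key=results.count)   # majority vote

def good_unit(x):
    return x * 2   # the computation every healthy copy performs

def broken_unit(x):
    return -1      # a copy that has silently failed

units = [good_unit, good_unit, good_unit, broken_unit]
print(systemic_step(units, 21))  # prints 42, despite the broken copy

--- End code ---

Even in that toy, three units burn silicon to do one unit's work - which is exactly the cost I'm wondering about.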

Also:
"Even when it feels like your computer is running all your software at the same time, it is just pretending to do that, flicking its attention very quickly between each program," Bentley says.
--- End quote ---
...has Bentley been living under a rock for the last, dunno, 10 years? We've had multicore machines for quite a while. Sure, each core runs one step at a time, but cores do run in parallel :-)
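Quick illustration of the difference, in case it helps (my own toy example, nothing from the article): hand a few CPU-bound tasks to separate processes and, on a box with four or more cores, they finish in roughly the time of one - which time-slicing on a single core can't do.

--- Code: ---
import multiprocessing
import time

# Toy demo of real parallelism: four CPU-bound tasks handed to four
# worker processes. On a machine with >= 4 cores they complete in
# roughly the time one task takes alone.

def burn(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    start = time.time()
    with multiprocessing.Pool(processes=4) as pool:
        pool.map(burn, [5000000] * 4)
    print("four tasks:", round(time.time() - start, 2), "s")

--- End code ---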

Might be an interesting idea, but even in the New Scientist story that Gizmodo links to, there's not a lot of information. How is the system different from clock-synchronized failover systems, which have also been around for quite a while (albeit at the pretty high end of computing)?

Renegade:
My initial thought was, oh! They've re-discovered the abacus~! :D

It would be nice though... Still, I'm sure that NVIDIA and ATI could manage to crash it. They have years of experience! :P

f0dder:
It would be nice though... Still, I'm sure that NVIDIA and ATI could manage to crash it. They have years of experience! :P
-Renegade (February 15, 2013, 06:37 PM)
--- End quote ---
To be fair, I don't recall seeing an nvidia driver *crash* (as in BSOD, not "things went flaky, screen flickered, and things were restored") since Vista, when Microsoft yanked big parts of the graphics driver stack back out of kernelmode into usermode. It used to be that way in the early NT days for über security, but most stuff was moved into kernelmode in... NT4?... for performance reasons - CPUs were slow back then. Linux is finally catching up on (the right) split between kernel- and usermode, and OSX still has too much in the kernel (remember the Chrome-can-crash-OSX debacle last year? That was a mix of a small bug in Chrome, a bigger bug in Intel's video drivers, and ugly OSX architecture :-)).

Also, graphics drivers are insane these days - they're mini operating systems: shader compilers, fragment schedulers, and sundry.
