
Need to store 5.5 Petabits long term? Try DNA.


SeraphimLabs:
You wouldn't want to rely on a single molecule for a given dataset either, though; that's just asking for single-bit errors or worse. And it's easy enough to duplicate that you might as well make copies, because with that kind of information density you could easily fit entire petabit RAID volumes into a wrist-watch-sized package.
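A quick back-of-envelope check on that wrist-watch claim (a minimal sketch in Python; the ~5.5 Pbit/mm³ density is the headline figure from the article, while the package volume and redundancy factor are numbers I made up):

--- Code: ---
# Rough sanity check of the "petabit RAID volume in a wrist watch" claim.
# Assumptions (mine, not the article's): ~1 cm^3 of usable space in a
# watch-sized package, and a 10x redundancy factor for error tolerance.

DENSITY_PBIT_PER_MM3 = 5.5      # petabits per mm^3 (the reported figure)
PACKAGE_MM3 = 1000.0            # ~1 cm^3 of usable volume (a guess)
COPIES = 10                     # redundancy factor (arbitrary)

raw = DENSITY_PBIT_PER_MM3 * PACKAGE_MM3    # capacity with no redundancy
usable = raw / COPIES                       # capacity after 10x duplication

print(f"Raw capacity:       {raw:,.0f} Pbit")     # 5,500 Pbit
print(f"With {COPIES}x copies: {usable:,.0f} Pbit")   # 550 Pbit
--- End code ---

Even with heavy duplication, hundreds of petabits in a watch-sized volume looks plausible on those numbers.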

But my concerns would be related to storage and handling, and exactly how tolerant it is of things like radiation and heat.


f0dder:
But my concerns would be related to storage and handling, and exactly how tolerant it is of things like radiation and heat. -SeraphimLabs (August 22, 2012, 03:52 PM)
--- End quote ---
Yep - and though you'd use extreme redundancy because of the data density, you'd need to be able to compare a lot of copies in order to determine the correct bits... and the article mentions that both reading and writing are (currently) slower than with normal media. You'd also want to store the data at multiple sites, and then you have the classic problem of link speed between those sites - now just with insanely larger data collections :)

But it's interesting technology, and I definitely hope they'll get any kinks sorted out.
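Just to make the "compare a lot of copies" step concrete, here's a toy sketch (Python, purely illustrative - a real pipeline would vote on base calls coming off the sequencer, not on clean bit arrays):

--- Code: ---
# Bitwise majority vote across N redundant copies of the same data:
# each bit of the output is whatever the majority of the copies say.

def majority_vote(copies: list[bytes]) -> bytes:
    """Recover the most likely original from corrupted copies, bit by bit."""
    n = len(copies)
    result = bytearray(len(copies[0]))
    for i in range(len(result)):
        for bit in range(8):
            ones = sum((c[i] >> bit) & 1 for c in copies)
            if ones * 2 > n:            # strict majority says this bit is 1
                result[i] |= 1 << bit
    return bytes(result)

# Three copies, two of them carrying bit errors, still decode correctly:
copies = [b"DNB", b"DNA", b"FNA"]
assert majority_vote(copies) == b"DNA"
--- End code ---

With only three copies, two errors landing in the same bit position already win the vote - which is why you'd want far more copies at these densities, and why the comparison work balloons.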

SeraphimLabs:
But my concerns would be related to storage and handling, and exactly how tolerant it is of things like radiation and heat. -SeraphimLabs (August 22, 2012, 03:52 PM)
--- End quote ---
Yep - and though you'd use extreme redundancy because of the data density, you'd need to be able to compare a lot of copies in order to determine the correct bits... and the article mentions that both reading and writing are (currently) slower than with normal media. You'd also want to store the data at multiple sites, and then you have the classic problem of link speed between those sites - now just with insanely larger data collections :)

But it's interesting technology, and I definitely hope they'll get any kinks sorted out.
-f0dder (August 22, 2012, 04:05 PM)
--- End quote ---

Yeah, but it's done at the molecular level. All they have to do is design a molecule that pairs the storage molecules up for reading, knocking any mismatched ones out of suspension in the process so that they never get read.

Then all the reader has to do is pick a confirmed molecule out of each batch of pairs, scan it, and send the data on its way.

Essentially, take a page from nature and engineer the rest of the organic processes involved in DNA handling. Although mutations might occur a little more often, it wouldn't take much to borrow the bacterial ability to copy DNA from cell to cell and make the data self-repairing.
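In software terms, the pairing filter might look something like this toy model (the Watson-Crick base-pairing rules are real chemistry; the filtering step and all the names here are just illustration):

--- Code: ---
# Toy model of "mismatched pairs get knocked out of suspension":
# each stored molecule is a data strand plus its complement, and a copy
# is only passed to the reader if the two strands still pair correctly.

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    return "".join(COMPLEMENT[base] for base in strand)

def read_batch(pairs: list[tuple[str, str]]) -> list[str]:
    """Keep only the strands whose complement still matches."""
    return [s for s, c in pairs if complement(s) == c]

good = "ACGTAC"
batch = [(good, complement(good)),
         ("ACGTAG", complement(good)),   # mutated copy: last base C -> G
         (good, complement(good))]

assert read_batch(batch) == [good, good]  # the damaged copy is never read
--- End code ---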

fenixproductions:
"sarcastic trolling"the Church team used commercial DNA microchips to create standalone DNA
-40hz (August 22, 2012, 12:08 PM)
--- End quote ---

To think they were burning witches not so long ago ;)
