A discussion of the fact that tax day does not, contrary to some recent quotes, fall upon the Ides of April (which is today, the 13th, not the 15th) led to a back-of-the-envelope calculation: a 64-bit integer field would be sufficient to uniquely identify any point in time since the estimated moment of the Big Bang to millisecond precision. It was a simple step from there to calculate that approximately 84 bits would be required to do the same at cesium-beam atomic-clock resolution, and only slightly more complex to determine that with 140 bits to play with, one could enumerate time from the Big Bang out to approximately six times the current estimated age of the Universe at quantum-time resolution (though, for obvious technical reasons, not with quantum-time precision).
Practical application of this calculation is left as an exercise for the reader.
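As a starting point for that exercise, here is a minimal Python sketch that redoes the envelope with present-day textbook values: a 13.8-billion-year age, the 9,192,631,770 Hz cesium-133 hyperfine frequency, and the Planck time. These constants are my assumptions, not necessarily the ones used above, and with them the counts land a few bits higher than the figures quoted.

    import math

    # Assumed constants -- present-day textbook values, not necessarily
    # the ones behind the original envelope.
    AGE_OF_UNIVERSE_S = 13.8e9 * 31_557_600   # ~13.8 Gyr in seconds (Julian years)
    CESIUM_HZ = 9_192_631_770                 # Cs-133 hyperfine transition frequency
    PLANCK_TIME_S = 5.391e-44                 # Planck time, sqrt(hbar*G/c^5)

    def bits_needed(ticks: float) -> int:
        """Smallest integer width able to enumerate this many distinct ticks."""
        return math.ceil(math.log2(ticks))

    print(bits_needed(AGE_OF_UNIVERSE_S * 1_000))              # milliseconds: ~69 bits
    print(bits_needed(AGE_OF_UNIVERSE_S * CESIUM_HZ))          # cesium periods: ~92 bits
    print(bits_needed(AGE_OF_UNIVERSE_S / PLANCK_TIME_S))      # Planck times: ~203 bits
    print(bits_needed(6 * AGE_OF_UNIVERSE_S / PLANCK_TIME_S))  # 6x the age: ~205 bits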
Minor edit: The first site I referred to had an error in their derivation of the Planck distance; they substituted Planck's constant (h) for Dirac's (ħ = h/2π), which is probably why they were two orders of magnitude off. Enumerating the life of the universe to date at quantum-time resolution should actually require around 144 bits.
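For anyone who wants to check, a quick sketch of the standard derivation of the Planck time from Dirac's constant, with the h-for-ħ substitution shown alongside for comparison (the constants here are my own rounded values):

    import math

    G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8               # speed of light, m/s
    h = 6.626e-34             # Planck's constant, J*s
    hbar = h / (2 * math.pi)  # Dirac's constant

    t_planck = math.sqrt(hbar * G / c**5)  # standard value, ~5.39e-44 s
    t_planck_h = math.sqrt(h * G / c**5)   # with h substituted: larger by sqrt(2*pi), ~2.5x

    print(t_planck, t_planck_h)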
no subject
So ~500 bits is a pretty scarily large number.
Guess we won't be getting to 128-bit addressing on a computer anytime soon. :-)