On Fri, 29 Jun 2018 09:51:55 +0100 "Steve Simon" <steve(a)quintile.net>
wrote:
> I know this is a dangerous game to play, but what of
> the future?
>
> I am intrigued by the idea of true optical computers,
> perhaps the $10M super-computer will return?
One problem with optical is feature size. Even ultraviolet 200nm light
waves are pretty big compared to the feature size we now have in the
best chips (which is under 10nm at this point, though it will soon
stall out.) If you want EM waves that are close to the size of
current features, you're in the range of X-rays and aren't going to
have an easy time manipulating them.
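As a rough sanity check on that wavelength argument (my own sketch, not part of the original mail), photon energy scales inversely with wavelength, E ≈ 1239.84 eV·nm / λ, so shrinking from 200 nm to the ~10 nm feature scale pushes photon energies from deep UV (~6 eV) into the extreme-UV / soft-X-ray range (~124 eV):

```python
# Illustrative photon-energy check (not from the original mail).
# E[eV] = h * c / lambda ~= 1239.84 eV*nm / lambda[nm]
HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm


def photon_energy_ev(wavelength_nm):
    """Photon energy in eV for a wavelength given in nanometers."""
    return HC_EV_NM / wavelength_nm


for nm in (200, 10):
    print(f"{nm:>4} nm -> {photon_energy_ev(nm):7.2f} eV")
# 200 nm is deep UV (~6.2 eV); 10 nm is already in
# the extreme-UV / soft-X-ray regime (~124 eV).
```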
All that said, a different kind of gedankenexperiment:
Say you wanted to store data as densely as possible. Let's ignore how
you would manage to read it and consider things on the same order of
magnitude as "as dense as we're going to get". If you stored 1s and
0s as C12 and C13 atoms in a diamond lattice, you get about
1.75e23 bits per cc, or about 22 zettabytes per cc. (Someone should
check my math.) I think it might be hard to do more than an order of
magnitude better than that. So that's a crazy amount of storage, but
it looks like a pretty strong limit.
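Taking up the invitation to check the math, here is the back-of-the-envelope calculation spelled out (my own sketch): diamond is about 3.51 g/cc, carbon is about 12 g/mol, so atoms (= bits) per cc is density / molar mass * Avogadro's number:

```python
# Back-of-the-envelope check of the diamond-lattice storage density.
AVOGADRO = 6.022e23        # atoms per mole
DIAMOND_DENSITY = 3.51     # grams per cubic centimeter
CARBON_MOLAR_MASS = 12.01  # grams per mole (natural isotope mix)

# One bit per carbon atom (C12 = 0, C13 = 1).
bits_per_cc = DIAMOND_DENSITY / CARBON_MOLAR_MASS * AVOGADRO
zettabytes_per_cc = bits_per_cc / 8 / 1e21  # 1 ZB = 1e21 bytes

print(f"{bits_per_cc:.2e} bits per cc")        # ~1.76e23 bits
print(f"{zettabytes_per_cc:.0f} ZB per cc")    # ~22 zettabytes
```

That confirms the ~1.75e23 bits/cc figure; dividing by 8 gives roughly 22 zettabytes per cc.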
> GPUs have definitely taken over in some areas - even where
> SIMD would not seem a good fit. My previous employer does motion
> estimation of realtime video and has moved from custom electronics
> and FPGAs to using off the shelf GPUs in PCs.
Not surprised to hear it. They're kind of everywhere at this point,
especially in scientific computation.
Perry
--
Perry E. Metzger perry(a)piermont.com