Flopping into the future — and past

Originally published 14 September 1998

Are you ready for teraflops? Picoseconds?

Wait! Before we get to tera and pico, let’s pause to remember mega and micro.

It seems like only yesterday that mega and micro were all the rage, whisking us to new limits of technology.

Mega is the metric prefix for “million.” A typical home computer has megabytes of built-in memory; that is, it can store millions of alphabetic or numeric characters.

Micro is the prefix for one-millionth. Computers do calculations in millionths of a second.

Computer pioneers like Steve Jobs reaped megabucks selling us machines that did megajobs in microseconds.

How quickly things change. Today, mega and micro seem quaintly old hat, and giga and nano preside. Even bottom-of-the-line computers these days have hard drives with gigabytes of memory.

Giga is the metric prefix for “billion” (a thousand million).

Bill Gates plumps his gigabuck fortune selling programs that require gigabytes of hard-disk memory.

The calculation speeds of the fastest computers are described in nanoseconds, or billionths of a second, and engineers are talking about using computer-chip technology to build robotic machines that are only nanometers big. Nano is a thousand times smaller than micro.

Young technogeeks now babble about giga and nano the way we oldsters once talked about kilo and milli.

And even before we manage to get comfortable with giga and nano, along come tera and pico.

Teraflop is the hot new word.

Tera is the prefix for trillion. Flop is short for “floating-point operation,” a kind of computer calculation performed with a movable decimal point. A teraflop computer can do a trillion floating-point operations per second — a thousand gigaflops, a million megaflops.

That averages out to a calculation every picosecond, or trillionth of a second.
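For readers who like to see the arithmetic spelled out, here is a minimal sketch in Python that checks the conversions; the constants are simply the metric prefixes, nothing more:

```python
# A quick check of the prefix arithmetic above. Nothing here comes
# from any real machine; the constants are just the metric prefixes.

TERA = 1e12   # trillion
GIGA = 1e9    # billion
MEGA = 1e6    # million
PICO = 1e-12  # trillionth

teraflop_machine = 1 * TERA          # one trillion operations per second

print(teraflop_machine / GIGA)       # 1000.0    -> a thousand gigaflops
print(teraflop_machine / MEGA)       # 1000000.0 -> a million megaflops

seconds_per_operation = 1 / teraflop_machine
print(seconds_per_operation / PICO)  # 1.0 -> one picosecond per operation
```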

Japanese engineers are designing a 32-teraflop machine that will consist of thousands of gigaflop processors linked in parallel, all working at once and exchanging information with one another.
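A back-of-the-envelope sketch shows how thousands of processors add up to 32 teraflops; the processor count and per-processor speed below are illustrative assumptions, not the machine’s published specification:

```python
# Back-of-the-envelope check of the parallel design. The figures are
# assumptions chosen to make the arithmetic come out; the real
# machine's specification may differ.

GIGA = 1e9
TERA = 1e12

processors = 4_000                # "thousands" of processors (assumed)
flops_per_processor = 8 * GIGA    # a gigaflop-class processor (assumed)

total_flops = processors * flops_per_processor
print(total_flops / TERA)         # 32.0 -- teraflops, the design goal
```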

With such a machine, they hope to build a “virtual Earth,” a mathematical model of the Earth’s oceans and atmosphere that will provide more precise predictions of phenomena such as global warming, air and water pollution, and El Niño weather.

Of course, this will not be the first digital simulation of the Earth. The National Center for Atmospheric Research in Colorado and the UK Meteorological Office in Britain, among others, have long been using sophisticated computer models of the Earth’s dynamic systems.

But 32 teraflops will bring us closer than ever to putting the real Earth in a box.

Meanwhile, other scientists are trying to get the entire universe into a box. A multinational consortium of astrophysicists and computer scientists has programmed a supercomputer at the Max Planck Society’s computing center in Garching, Germany, to simulate the evolution of the early universe.

The team is especially interested in discovering if slight density variations in the matter of the early universe, caused by “quantum fluctuations” in the Big Bang, might have amplified to become the kinds of galactic clusters we observe today.

In this new simulation, 10 galaxies are represented by a single dot of data. Of course, this is a huge simplification of the real universe — trillions of stars and planets reduced to a dot! Nevertheless, the program mimics the entire visible universe of galaxies interacting over 10 billion years of simulated time.
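The idea can be suggested in miniature. The toy sketch below (emphatically not the consortium’s actual code) scatters a few hundred “dots” almost uniformly, then lets gravity amplify the small statistical irregularities in their positions into clumps; every constant is an arbitrary toy value:

```python
# Toy N-body sketch: each point stands in for many galaxies. Random
# starting positions carry small statistical fluctuations, and gravity
# amplifies them into visible clustering. All constants are arbitrary
# toy values; this is an illustration, not the consortium's program.

import numpy as np

rng = np.random.default_rng(0)

n = 200                                  # particles (real runs: millions)
pos = rng.uniform(0.0, 1.0, (n, 2))      # nearly uniform initial positions
vel = np.zeros((n, 2))                   # everything starts at rest
G, m, dt, soft = 1e-4, 1.0, 0.01, 0.02   # gravity, mass, timestep, softening

def accelerations(pos):
    """Direct-sum gravity, softened so close encounters stay finite."""
    diff = pos[None, :, :] - pos[:, None, :]      # pairwise separations
    dist2 = (diff ** 2).sum(axis=-1) + soft ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                 # no self-attraction
    return G * m * (diff * inv_r3[:, :, None]).sum(axis=1)

# Leapfrog (kick-drift-kick) integration, a standard N-body choice.
acc = accelerations(pos)
for _ in range(500):
    vel += 0.5 * dt * acc
    pos += dt * vel
    acc = accelerations(pos)
    vel += 0.5 * dt * acc

# The spread of positions shrinks as particles fall into clumps.
print("spread of positions:", pos.std(axis=0))
```

Run long enough, the initially smooth scatter develops knots and voids, a cartoon of the clustering the real simulation follows across 10 billion years.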

Each run of the simulation requires about 70 hours of calculation and generates nearly a terabyte of data showing how galaxies were distributed at various moments of cosmic time.

The computer modelers try various theoretical scenarios for what the universe was like in the earliest days, and see which sort of beginning leads to a mature universe that most closely matches the one we observe with our telescopes.

Historical sciences, such as cosmology or evolution, do not lend themselves to experimental testing of the usual sort; after all, we can’t do experiments with the entire universe in the laboratory, nor do we have billions of real years to perform experiments on evolution. But we can build mathematical simulations of cosmic space and geologic time that tell us if our theories reasonably account for the world we observe.

Will scientists ever build a universe-in-a-box that contains stars, planets, life, and mind, all interacting over simulated billions of years of time? For even the first steps in that direction, teraflop machines will not be enough. We must await the arrival of petaflop computers (a thousand teraflops).

Considering how fast we progressed from kilo to mega to giga to tera, can peta be far behind?
