The IBM 7090 Era in Numerical Weather Prediction (1959–1969)
Deep research for a blog post covering the decade when NWP transitioned from experimental curiosity to indispensable tool – driven by transistorized mainframes, primitive equation models, satellites, and supercomputers.
1. The Computer Timeline at NMC/JNWPU
IBM 704 (installed 1957)
The IBM 704 replaced the IBM 701 at JNWPU in 1957. Its magnetic-core memory (mean time between failures of 7–9 hours versus the 701’s catastrophic 30-minute MTBF on Williams tubes) and hardware floating-point arithmetic were transformative. Combined with Shuman’s improved barotropic model and the Cressman objective analysis scheme (published 1959), the 704 era produced the first operationally useful forecasts. By 1958, the improved barotropic model was demonstrating steadily increasing skill; by 1960, NWP products were beginning to surpass manually computed forecasts at the 500-millibar level.
IBM 7090 (installed 1960)
The IBM 7090 was installed at the National Meteorological Center in 1960, replacing the IBM 704. It was IBM’s first commercially produced transistorized scientific computer – over 50,000 germanium alloy-junction transistors on Standard Modular System (SMS) cards. Key performance specifications:
| Parameter | IBM 704 | IBM 7090 | Improvement |
|---|---|---|---|
| Logic technology | Vacuum tubes | Transistors (germanium) | Generational leap |
| Processing speed | ~40,000 instructions/sec | ~100,000 instructions/sec (~100 KFLOPS) | ~2.5x raw speed |
| Memory cycle time | 12 microseconds | 2.18 microseconds | ~5.5x faster |
| Memory capacity | 4,096–32,768 words | 32,768 words | Same maximum |
| Reliability | MTBF ~7–9 hours | Significantly higher | Major improvement |
| Purchase price | ~$2,000,000 | ~$2,900,000 | – |
Note: The 7090 is typically described as “six times faster than the IBM 709” (its vacuum-tube predecessor), not six times faster than the 704. The 704-to-7090 improvement was roughly 2.5x in raw instruction throughput, but the faster memory cycle time and transistor reliability meant the effective improvement for sustained NWP runs was considerably larger.
The NOAA 200th anniversary website preserves a photograph captioned: “A meteorologist at the console of the IBM 7090 electronic computer in the Joint Numerical Weather Prediction Unit. This computer was used to process weather data for short- and long-range forecasts, analyses, and research.”
What the 7090 enabled: The additional speed allowed NMC to explore multi-level (baroclinic) models that had been computationally infeasible on the 704. A three-layer hemispheric model was introduced in 1962, representing the first operational step beyond simple barotropic forecasting.
IBM 7094 (installed ~1963)
The IBM 7094 (introduced September 1962) succeeded the 7090 at NMC around 1963. It featured 7 index registers (up from 3), double-precision floating-point, and was up to twice as fast as the 7090 for general computation. The 7094 II variant (first installed April 1964) added dual memory banks and pipelined execution, nearly doubling speed again.
By 1963, the first operational baroclinic model was running on the new IBM 7094 at NMC.
CDC 6600 (installed ~1965–1966)
The arrival of the CDC 6600 at NMC enabled the first hemispheric primitive equation model run in June 1966. The CDC 6600, designed by Seymour Cray at Control Data Corporation, was delivered starting in 1964 and remained the world’s fastest computer from 1964 to 1969. Key specifications:
| Parameter | Value |
|---|---|
| Designer | Seymour Cray |
| First delivery | 1964 (LLNL, LANL) |
| Speed | ~3,000,000 instructions/second (~3 MFLOPS) |
| Architecture | Single fast CPU + 10 peripheral processors |
| Transistors | ~250,000 silicon transistors |
| Cooling | Chilled freon through logic modules |
| World’s fastest | 1964–1969 (succeeded by CDC 7600) |
The CDC 6600 was roughly 30x faster than the IBM 7090. This enormous leap in computational power was what made the primitive equation model operationally feasible – the PE model required integrating the full set of hydrostatic primitive equations at multiple atmospheric levels across a hemispheric grid, a task far beyond the capacity of 7090-class machines within operational time constraints.
Summary Computer Timeline at NMC
| Year | Computer | Key NWP milestone |
|---|---|---|
| 1955 | IBM 701 | First operational NWP (May 6, 1955); disappointing results |
| 1957 | IBM 704 | Improved barotropic model; first useful skill by 1958 |
| 1960 | IBM 7090 | Multi-level models explored; three-layer model (1962) |
| ~1963 | IBM 7094 | First operational baroclinic model |
| ~1965–66 | CDC 6600 | Six-layer primitive equation model (June 1966) |
| 1970s | IBM 360/195 | High-resolution PE model (1978) |
Each successive computer at NMC was, as a rough rule of thumb, about six times more powerful than its predecessor – a pattern noted by Harper, Uccellini, et al. (2007) in their retrospective on 50 years of NWP.
2. The Transition from Barotropic to Baroclinic to Primitive Equation Models
Barotropic era (1955–early 1960s)
The first operational model at JNWPU (1955) was Charney’s three-level quasi-geostrophic model, but it produced disappointing results. After trying the Thompson-Gates two-layer thermotropic model (1956), JNWPU reverted to a simpler one-level barotropic model in 1958 – Shuman’s improved version with automated objective analysis (the Cressman scheme) and an octagonal Northern Hemisphere domain. This was the first model to demonstrate operationally useful skill.
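The Cressman scheme mentioned above is simple enough to sketch. Below is a minimal single-pass version in Python – illustrative only: the function name and the nearest-grid-point innovation shortcut are my own simplifications, and the operational scheme ran multiple passes with a shrinking radius of influence.

```python
import numpy as np

def cressman_pass(grid_x, grid_y, background, obs_x, obs_y, obs_val, R):
    """One successive-correction pass of Cressman-style objective analysis.

    Grid-point corrections are weighted averages of observation
    innovations, using the classic weight w = (R^2 - r^2) / (R^2 + r^2)
    inside the radius of influence R and zero outside.
    """
    num = np.zeros_like(background)
    den = np.zeros_like(background)
    for k in range(len(obs_val)):
        # Innovation: observation minus background at the nearest grid
        # point (the real scheme interpolates; this is a simplification).
        i = int(np.abs(grid_y - obs_y[k]).argmin())
        j = int(np.abs(grid_x - obs_x[k]).argmin())
        d = obs_val[k] - background[i, j]
        # Squared distance from every grid point to this observation.
        r2 = (grid_x[None, :] - obs_x[k]) ** 2 + (grid_y[:, None] - obs_y[k]) ** 2
        w = np.where(r2 < R ** 2, (R ** 2 - r2) / (R ** 2 + r2), 0.0)
        num += w * d
        den += w
    correction = np.where(den > 0, num / np.maximum(den, 1e-12), 0.0)
    return background + correction
```

Operationally, Cressman analyses ran several passes with a decreasing R, so the large-scale field was fitted first and finer detail added where observations were dense.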
Shuman wrote in his 1989 retrospective: “Within three years three agencies of the United States Government jointly created a numerical weather prediction service, but it was quickly discovered that current models had very serious defects. After considerable research, the first operationally effective model was achieved in 1958.”
The barotropic model solved a single vorticity equation at the 500-millibar level – essentially treating the atmosphere as a single layer. It captured the large-scale flow patterns that steer weather systems but could not represent thermal advection, frontogenesis, or the vertical structure of cyclones.
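In symbols, the model stepped forward the barotropic vorticity equation (standard textbook form, not NMC’s exact finite-difference formulation):

```latex
\frac{\partial \zeta}{\partial t} = -\mathbf{V}_\psi \cdot \nabla\,(\zeta + f),
\qquad \zeta = \nabla^2 \psi
```

where ψ is the 500-mb streamfunction, ζ the relative vorticity, and f the Coriolis parameter. Everything the model knew about the atmosphere was encoded in this single scalar field.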
Remarkably, a derivative of this barotropic model – the “barotropic mesh model” – remained in operational use at NMC through 1985, long after far more sophisticated models had been developed.
Three-layer hemispheric model (1962)
By 1962, NMC introduced a three-layer hemispheric model, taking the first operational step toward baroclinic (multi-level) forecasting. This model could represent the vertical wind shear and thermal gradients that drive cyclone development – phenomena invisible to the barotropic model. It ran on the IBM 7090 and later the 7094.
The six-layer primitive equation model (June 1966)
The landmark advance came in mid-1966 when Shuman and Hovermale’s six-layer primitive equation (PE) model became operational at NMC, made possible by the CDC 6600.
What was different: Previous operational models (barotropic and quasi-geostrophic) used the vorticity equation as their central dynamical equation – an approximation that filtered out certain types of atmospheric motion. The PE model integrated the primitive (hydrostatic) hydrodynamic and thermodynamic equations directly, a fundamental departure. The primitive equations are the most complete set of equations usable without resolving vertically propagating sound waves (which the hydrostatic approximation filters out), and they include:
- The horizontal momentum equations (with the Coriolis effect)
- The hydrostatic equation
- The thermodynamic energy equation
- The continuity equation
- The equation of state
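Written out in pressure coordinates (a standard textbook rendering, not Shuman and Hovermale’s exact sigma-type vertical-coordinate formulation), the set looks like:

```latex
\begin{aligned}
\frac{d\mathbf{V}}{dt} + f\,\hat{\mathbf{k}}\times\mathbf{V} &= -\nabla_p \Phi
  &&\text{(horizontal momentum)}\\[2pt]
\frac{\partial \Phi}{\partial p} &= -\frac{RT}{p}
  &&\text{(hydrostatic)}\\[2pt]
\frac{dT}{dt} - \frac{\kappa T \omega}{p} &= \frac{Q}{c_p}
  &&\text{(thermodynamic energy)}\\[2pt]
\nabla_p \cdot \mathbf{V} + \frac{\partial \omega}{\partial p} &= 0
  &&\text{(continuity)}\\[2pt]
p &= \rho R T
  &&\text{(state)}
\end{aligned}
```

Here Φ is geopotential, ω = dp/dt is the pressure-coordinate vertical velocity, κ = R/c_p, and Q is diabatic heating.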
The six reference levels were carefully selected to capture the vertical structure of the atmosphere, including the boundary layer, the jet stream level, and the stratosphere.
Results: In its first fourteen months of operational use, the PE model produced “highly significant improvements in the Center’s products.” The model could predict the development and movement of extratropical cyclones with far greater accuracy than the barotropic or quasi-geostrophic models because it captured the baroclinic instability mechanism – the fundamental process by which storms extract energy from the temperature contrast between tropical and polar air masses.
The model also improved wind and temperature forecasts at multiple levels, which was critical for aviation and military applications.
Publication: Shuman, F.G., and J.B. Hovermale, 1968: “An Operational Six-Layer Primitive Equation Model.” Journal of Applied Meteorology, 7(4), 525–547.
Later PE developments
- 7-layer PE model: An expanded version documented in “The Selection of the 7L PE for NMC Operations” (NMC Office Note).
- 8-layer global PE model (1972): In November 1972, NMC put into operation a fully global analysis-forecast system employing a Hough function spectral analysis and an 8-layer primitive equation forecast model on a 5-degree latitude-longitude grid.
- A “semi-implicit higher order version of the Shuman-Hovermale model” was later developed, improving computational efficiency.
3. The IBM 7094, “Daisy Bell,” and HAL 9000
The first computer to sing (1961)
In 1961, an IBM 7094 mainframe at Bell Telephone Laboratories in Murray Hill, New Jersey, became the first computer to sing, rendering “Daisy Bell” (also known as “Bicycle Built for Two”). The vocals were programmed by physicist John L. Kelly Jr. and Carol Lochbaum using vocoder speech synthesis, with musical accompaniment written by computer music pioneer Max Mathews.
Arthur C. Clarke witnesses the demonstration
Science fiction novelist Arthur C. Clarke visited Bell Labs at the invitation of his friend John R. Pierce (an electrical engineer and fellow science fiction writer who was a Bell Labs employee). Clarke witnessed the IBM 7094 singing “Daisy Bell” and was profoundly impressed.
HAL 9000 sings in 2001: A Space Odyssey
Clarke incorporated the singing computer into his 1968 novel and Stanley Kubrick’s 1968 film 2001: A Space Odyssey. In the story, “Daisy Bell” was one of the first things the HAL 9000 computer learned when it was originally programmed. In the film’s most famous scene, as astronaut Dave Bowman disconnects HAL’s higher brain functions, the dying AI regresses to its earliest memory and begins singing “Daisy Bell” – its voice slowing and deepening into a distorted drawl as its mind shuts down.
The NWP connection
The same model of computer that sang “Daisy Bell” at Bell Labs and inspired HAL 9000 – the IBM 7094 – was running weather forecasts at NMC in Suitland, Maryland, around 1963, and 7094s were also computing trajectories for NASA’s space program. Three of the defining applications of mid-century computing – weather prediction, space navigation, and artificial intelligence as cultural anxiety – all ran on the same hardware platform.
This is a rich thematic connection for the blog: the machine that calculated whether it would rain tomorrow also calculated how to reach the Moon, and also sang a song that became a metaphor for the terror of machine consciousness.
4. NASA and the 7090/7094: Space and Weather on the Same Hardware
Project Mercury (1961–1963)
NASA used two IBM 7090s at the Goddard Space Flight Center in Greenbelt, Maryland, to control the Mercury space flights. These machines computed powered flight trajectory parameters and real-time spacecraft position, transmitting continuous data to Mission Control displays. A backup IBM 709 operated from Bermuda.
Project Gemini (1965–1966)
Goddard operated three IBM 7094s for Gemini. The Jet Propulsion Laboratory had three additional 7094s in the Space Flight Operations Facility, plus two 7094/7044 direct-coupled systems.
Apollo (1968–1972)
One 7094 was retained during the Apollo program to run flight planning software that had not yet been ported to the newer IBM System/360 computers at Mission Control.
The parallel with NWP
The same class of machines – the IBM 7090 at NMC in Suitland and the IBM 7090s at Goddard in Greenbelt – were located roughly 15 miles apart in the Maryland suburbs of Washington, D.C. Both were solving systems of partial differential equations that described the behavior of fluid media (the atmosphere for weather; gravitational fields for orbital mechanics). Both required high-speed floating-point arithmetic and large amounts of core memory. Both were pushing the machines to their computational limits.
The cultural weight is significant: in the early 1960s, the United States was simultaneously using the same computers to predict the weather and to send humans into space. The narrative of “mastering nature through computation” was playing out in parallel, on identical hardware, in adjacent buildings.
Other notable 7090/7094 applications
- Erhard Glatzel used a 7090 to compute the Carl Zeiss Planar 50mm f/0.7 lens (later used by Kubrick to film by candlelight in Barry Lyndon).
- Michael Minovitch solved the three-body problem on UCLA’s 7090, laying the foundation for the Voyager missions’ Planetary Grand Tour.
- Daniel Shanks and John Wrench computed the first 100,000 digits of pi on a 7090 in 1961 (published 1962).
- SABRE airline reservation system ran on a pair of 7090s (1962).
5. The CDC 6600 Era at NMC
World’s first supercomputer
The CDC 6600, designed by Seymour Cray, is generally considered the first commercially successful supercomputer. It executed approximately 3 million instructions per second – about three times faster than the IBM 7030 Stretch (which was itself a special-order machine). The CDC 6600 remained the world’s fastest computer from 1964 to 1969, when it was succeeded by the CDC 7600 (also a Cray design).
The first units were delivered to Lawrence Livermore and Los Alamos in 1964. NCAR (the National Center for Atmospheric Research) took delivery of its CDC 6600 in late December 1965.
NMC’s CDC 6600
NMC acquired a CDC 6600 around 1965–1966. Its arrival made the Shuman-Hovermale six-layer primitive equation model operationally feasible. The PE model required approximately 30 times more computation than the barotropic model for the same domain (hemispheric coverage at 6 vertical levels instead of 1, plus the more complex equation set), and only the CDC 6600 could deliver results within the 2–3 hour operational time window.
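A toy calculation makes the feasibility argument concrete. All numbers below are illustrative: the machine speeds are the rough figures quoted in this document, and the operation counts are invented order-of-magnitude placeholders, not measured workloads.

```python
# Toy feasibility check: does a forecast fit the 2-3 hour operational window?
# Machine speeds are the rough sustained figures quoted in this document;
# the flop counts are invented order-of-magnitude placeholders.

def run_hours(total_flops: float, flops_per_sec: float) -> float:
    """Wall-clock hours to execute total_flops at a sustained rate."""
    return total_flops / flops_per_sec / 3600.0

IBM_7090 = 100e3   # ~100 KFLOPS sustained (rough)
CDC_6600 = 3e6     # ~3 MFLOPS sustained (rough)

BAROTROPIC = 1e9            # hypothetical op count per barotropic forecast
PE_MODEL = 30 * BAROTROPIC  # the text says the PE model cost ~30x more

print(f"Barotropic on 7090: {run_hours(BAROTROPIC, IBM_7090):.1f} h")  # fits the window
print(f"PE on 7090:         {run_hours(PE_MODEL, IBM_7090):.0f} h")    # hopeless
print(f"PE on 6600:         {run_hours(PE_MODEL, CDC_6600):.1f} h")    # just fits
```

The point is qualitative: a 30x increase in cost turns a run that barely fits the window into one that takes days, and only a roughly 30x faster machine brings it back inside – which is why the PE model waited for the 6600.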
The Navy’s parallel path: Fleet Numerical Weather Central
The U.S. Navy’s Fleet Numerical Weather Facility (FNWF) – later Fleet Numerical Weather Central (FNWC) – established in Monterey, California in 1961, followed a parallel computing trajectory. In 1967, FNWC installed a CDC 6500, a dual-processor version of the CDC 6600 built from roughly 250,000 early silicon transistors, with chilled freon pumped through the logic circuit modules for cooling.
After acquiring a second CDC 6500 in 1969, FNWC staff (Philip G. Kesel and F. Winninghof) wrote “the world’s first multi-processor production code” – running the Northern Hemisphere Primitive Equation (NHPE) model across the four processors of two dual machines linked through shared extended core storage. This was one of the earliest examples of parallel computing for NWP.
6. Key NWP Milestones of the 1960s
First operational baroclinic model (~1962–1963)
NMC introduced a three-layer hemispheric model around 1962, running on the IBM 7090/7094. This was the first operational model to capture the vertical structure of the atmosphere, though it used quasi-geostrophic dynamics rather than the full primitive equations.
First hemispheric primitive equation model (June 1966)
Shuman and Hovermale’s six-layer PE model, running on the CDC 6600, became operational in mid-1966. It was the first model to integrate the primitive equations operationally over a hemispheric domain.
First global atmospheric model (research, 1963–1965; operational, 1972)
Joseph Smagorinsky and Syukuro Manabe at the Geophysical Fluid Dynamics Laboratory (GFDL) – originally the General Circulation Research Section of the U.S. Weather Bureau, renamed GFDL in 1963 – developed the first hemispheric primitive-equation general circulation model by 1963. Their nine-level, three-dimensional model was initially run on the IBM Stretch (IBM 7030), one of the most powerful computers of the early 1960s.
The landmark paper – Smagorinsky, Manabe, and Holloway, 1965: “Numerical results from a nine-level general circulation model of the atmosphere,” Monthly Weather Review, 93(12), 727–768 – described a model with many characteristics still familiar in modern GCMs: a fluid dynamical core solving the primitive equations on the sphere with enough horizontal resolution to explicitly resolve midlatitude storm structure.
NMC did not run a fully global operational model until November 1972, when it implemented a global analysis-forecast system.
TIROS-1 satellite (April 1, 1960)
TIROS-1 (Television Infrared Observation Satellite), launched on April 1, 1960, was the first successful meteorological satellite. Weighing approximately 270 pounds, it carried two television cameras and two video recorders. Over its 2.5-month lifespan, TIROS-1 returned 23,000 photographs of the Earth, 19,000 of them usable for weather analysis.
For the first time, it was possible to view large-scale cloud patterns in their entirety and identify storm regions from above. TIROS-1 provided data on smaller-scale features including tornadoes and jet streams.
Impact of satellite data on NWP
The integration of satellite data into NWP models was not immediate. The first experiments in satellite data assimilation began in the late 1960s. The key challenge was converting satellite observations (cloud images, radiance measurements) into the quantitative atmospheric variables (temperature, humidity, wind) that models required as initial conditions.
The Nimbus program (first launch 1964) carried increasingly sophisticated instruments including infrared spectrometers for deriving vertical temperature profiles. The Vertical Temperature Profile Radiometer (VTPR), mounted on NOAA-2 through NOAA-5 satellites, made routine atmospheric sounding observations from November 1972 to February 1979 – the first operational instrument for satellite temperature soundings.
Weather satellites became a critical data source for NWP during the 1960s and 1970s, particularly over the oceans and the Southern Hemisphere where conventional observations (radiosondes, surface stations) were sparse. The satellites filled what had been enormous data voids, enabling truly global models.
7. The Human Side: Forecasters versus Computers
The early disappointment
When NWP forecasts first arrived on forecasters’ desks in 1955, they were, in Shuman’s words, “very disappointing and not usable by forecasters.” The early computer forecasts “offered no competition with manually-produced forecasts.” This poor performance reinforced existing skepticism among veteran forecasters, who had spent decades developing intuitive pattern-recognition skills.
Institutional resistance
The Weather Bureau bureaucracy itself was not uniformly enthusiastic. Historians have characterized the Weather Bureau as “obdurate with no interest in modernization,” focused on short-term practical forecasting rather than theoretical understanding of atmospheric circulation. However, this characterization may be overstated – the Bureau did fund and support JNWPU from its establishment.
Sutcliffe’s lament (1956)
The most eloquent expression of forecaster ambivalence came from Reginald Sutcliffe, who directed NWP research at the UK Met Office. In 1956, Sutcliffe wrote:
“We are taught to look for the day when machines will calculate the future weather with a monotonous degree of success, but if that day comes one satisfying profession will be lost to man and we must look elsewhere. There will be little more joy in the trade than there is in the repetition of the multiplication table.”
Sutcliffe also declared himself a forecaster first: “I am happy to claim membership in the forecasters group, once a forecaster always a forecaster.” This is a remarkable statement from the person responsible for building the UK’s NWP program – expressing genuine grief at the prospect that his own work might render his profession meaningless.
The Swedish approach
Sweden, which operationalized NWP slightly before the United States, was also first to confront the human challenge. The Swedes developed a “User Guide” for human forecasters that recommended generally accepting NWP forecasts (given their known accuracy advantage) but identified specific areas where NWP was known to be problematic. One challenge noted was that NWP forecasts could change drastically from one day to the next, creating a messaging problem for forecasters explaining their predictions to the public.
The turning point (1958–1960)
The tide began to turn around 1958–1960:
- By 1958, the improved barotropic model with Cressman’s objective analysis was producing forecasts of steadily increasing skill.
- Weather Bureau forecasters began receiving NWP products in real time in 1958.
- By 1960, some NWP products were surpassing the quality of manually computed forecasts at the 500-millibar level.
But forecasters continued to view model output with skepticism – and with some justification. The models forecast upper-level flow patterns (500 millibar), but could not directly predict surface weather: rain, snow, temperature, wind at ground level. Translating an upper-level chart into a surface forecast still required human judgment. This gap between what the model could produce and what the public needed persisted throughout the 1960s and into the 1970s.
The “man-machine mix”
By the late 1960s, a consensus emerged that the future of weather forecasting would be what meteorologists called the “man-machine mix”: NMC would produce purely machine-generated upper-level guidance charts, while field forecasters would combine these with their local knowledge, satellite imagery, and surface observations to produce the final public forecast. The expectation was articulated in a 1969 Weather Bureau Western Region document: within approximately five years, NMC meteorological products would become “about 100% machine output,” but the final weather service would remain “a man-machine mix with the man predominating.”
This concept – machine guidance combined with human judgment – remained the fundamental paradigm of operational weather forecasting for decades, until the current era where machine learning models are again raising questions about the role of the human forecaster.
A parallel to today
The forecaster-versus-computer debate of the 1960s rhymes with today’s machine-learning NWP debate. In both cases, a new technology produces initially poor results, gradually improves, and eventually outperforms traditional methods – while practitioners who have invested careers in the old approach feel threatened and skeptical. The difference is that the 1960s debate was resolved by the “man-machine mix” compromise; it remains to be seen whether the 2020s AI revolution will lead to a similar equilibrium or to more radical displacement.
8. Primary Sources and Key References
Shuman 1989 – the definitive insider history
Shuman, F.G., 1989: “History of Numerical Weather Prediction at the National Meteorological Center.” Weather and Forecasting, 4(3), 286–296. https://journals.ametsoc.org/view/journals/wefo/4/3/1520-0434_1989_004_0286_honwpa_2_0_co_2.xml
The single most important source for the NMC computer/model timeline. Written by the man who was JNWPU’s first employee and NMC director for 17 years.
Shuman and Hovermale 1968 – the PE model paper
Shuman, F.G., and J.B. Hovermale, 1968: “An Operational Six-Layer Primitive Equation Model.” Journal of Applied Meteorology, 7(4), 525–547. https://journals.ametsoc.org/view/journals/apme/7/4/1520-0450_1968_007_0525_aoslpe_2_0_co_2.xml
Harper et al. 2007 – 50th anniversary retrospective
Harper, K., L. Uccellini, E. Kalnay, K. Carey, and L. Morone, 2007: “50th Anniversary of Operational Numerical Weather Prediction.” Bulletin of the American Meteorological Society, 88(5), 639–650. https://journals.ametsoc.org/view/journals/bams/88/5/bams-88-5-639.xml
Yang 2025 – “The First Compute Arms Race”
Yang, Charles, 2025: “The First Compute Arms Race: the Early History of Numerical Weather Prediction.” https://charlesyang.io/assets/Supercomputing_and_Weather_Forecasting.pdf / https://arxiv.org/html/2506.21816v1
Excellent secondary source covering the U.S., UK, Sweden, Canada, and Japan NWP programs. Strong on institutional dynamics and forecaster resistance.
EMC History Page
“Environmental Modeling Center (EMC) History.” NCEP. https://www.emc.ncep.noaa.gov/emc/pages/ourhistory.php
Official timeline of models and computers at NMC/NCEP.
EMC Model History Table
“NCEP Model History Table.” EMC. https://www.emc.ncep.noaa.gov/emc/pages/model_history/modeltables/modelTable.php
Cressman oral history (1992)
Transcript of oral history interview with George Cressman, August 24, 1992. Interviewers: Warren Washington, Norman Phillips, Ron McPherson, Jim Howcroft. UCAR OpenSky: https://opensky.ucar.edu/islandora/object/:34962 / NOAA Digital Collections: https://www.noaa.gov/digital-collections/noaa-voices/george-cressman
GFDL brief history of global atmospheric modeling
“Brief History of Global Atmospheric Modeling at GFDL.” https://www.gfdl.noaa.gov/brief-history-of-global-atmospheric-modeling-at-gfdl/
Smagorinsky, Manabe, and Holloway 1965
Smagorinsky, J., S. Manabe, and J.L. Holloway, Jr., 1965: “Numerical results from a nine-level general circulation model of the atmosphere.” Monthly Weather Review, 93(12), 727–768.
NOAA Mariners Weather Log (2007)
“The History of Numerical Weather Prediction.” Mariners Weather Log, Vol. 51, No. 3, December 2007. https://www.vos.noaa.gov/MWL/dec_07/weatherprediction.shtml
NOAA 200th anniversary – IBM 7090 image
https://celebrating200years.noaa.gov/visions/climate/image2.html
Fleet Numerical Weather Central history
Laws, David A., “Weather Prediction Goes Digital,” Core+ (Computer History Museum), Medium. https://medium.com/chmcore/weather-prediction-goes-digital-596f5425c043
“Daisy Bell” and the IBM 7094
“The IBM 7094 is The First Computer to Sing.” History of Information. https://www.historyofinformation.com/detail.php?id=3986
“How an IBM Computer Learned to Sing (1961).” Ted Gioia, The Honest Broker. https://www.honest-broker.com/p/how-an-ibm-computer-learned-to-sing
“Daisy Bell.” Wikipedia. https://en.wikipedia.org/wiki/Daisy_Bell
TIROS-1
“Celebrating 65 Years of the World’s First Weather Satellite.” NESDIS, NOAA. https://www.nesdis.noaa.gov/news/celebrating-65-years-of-the-worlds-first-weather-satellite
“TIROS-1.” Wikipedia. https://en.wikipedia.org/wiki/TIROS-1
CDC 6600
“CDC 6600.” Wikipedia. https://en.wikipedia.org/wiki/CDC_6600
“CDC 6600.” NCAR Supercomputing History. https://www.cisl.ucar.edu/ncar-supercomputing-history/cdc6600
“CDC 6600’s Five Year Reign.” Computer History Museum Revolution. https://www.computerhistory.org/revolution/supercomputers/10/33
IBM 7090
“IBM 7090.” Wikipedia. https://en.wikipedia.org/wiki/IBM_7090
NASA and IBM 7090/7094
“Project Mercury.” IBM Archives. https://www.ibm.com/history/mercury
“Apollo.” IBM Archives. https://www.ibm.com/history/apollo
“IBM 7094: NASA & Air Force Computing in 1960s.” FedTech Magazine. https://fedtechmagazine.com/article/2016/10/how-ibm-7094-gave-nasa-and-air-force-computing-superiority-1960s
“Space Trajectories Program for the IBM 7090 Computer.” NASA Technical Reports Server. https://ntrs.nasa.gov/citations/19660007303
WPC History
“Summary of WPC History and Leadership, September 2025.” https://www.wpc.ncep.noaa.gov/html/WPC_history.pdf
Satellite data in NWP
Eyre, J., 2022: “Assimilation of satellite data in numerical weather prediction. Part I: The early years.” Quarterly Journal of the Royal Meteorological Society.
9. Blog Post Narrative Threads
Thread A: The six-times-faster rule
Each computer upgrade at NMC delivered approximately 6x more power, enabling a qualitative leap in model sophistication:
- 701 -> 704: From crashing every 30 minutes to completing a forecast
- 704 -> 7090: From barotropic to three-layer baroclinic
- 7090/7094 -> CDC 6600: From quasi-geostrophic to primitive equations
- Each step up the hardware ladder unlocked physics that had been computationally out of reach
Thread B: Weather and space – the same machine, different equations
The IBM 7090 at NMC (Suitland) and the IBM 7090s at Goddard (Greenbelt) – 15 miles apart – both solving PDEs to predict the behavior of physical systems. One predicted whether it would rain; the other predicted whether John Glenn’s capsule would splash down in the right ocean.
Thread C: Daisy Bell as metaphor
The IBM 7094 sang, calculated orbits, and predicted storms. Arthur C. Clarke turned the singing into HAL 9000’s death scene – the machine regressing to its earliest, most innocent memory as it dies. Meanwhile, the real 7094 at NMC was doing something arguably more consequential: learning to predict the atmosphere well enough to save lives.
Thread D: The forecaster’s dilemma
Sutcliffe’s 1956 lament is hauntingly relevant today. Replace “machines” with “AI models” and his words could appear in a 2025 editorial. The 1960s “man-machine mix” resolution – where human judgment remained primary and computer guidance was advisory – held for decades. The current AI revolution in NWP may or may not lead to a similar compromise.
Thread E: The CDC 6600 – computing’s inflection point
Seymour Cray’s CDC 6600 was the machine that made modern weather prediction possible. The jump from 100 KFLOPS (7090) to 3 MFLOPS (6600) was a 30x improvement – roughly equivalent to moving from a Model T to a Ferrari. And NWP was one of the applications that justified the supercomputer’s existence.
Research compiled: 2026-04-08
Word count: approximately 3,200 words