CDC 1604
from Wikipedia
CDC 1604
CDC 1604 with a figure for scale
Design
Manufacturer: Control Data Corporation
Designer: Seymour Cray
Release date: 1960
Units sold: 50+
Price: $1,030,000 (192-kilobyte configuration)[1]
Casing
Dimensions: height 176 cm (69 in), length 227 cm (89 in), width 68 cm (27 in)[2]
Weight: 2,200 pounds (1,000 kg)
Power: 5.5 kW @ 208 V, 60 Hz[2]
System
Operating system: Co-Op Monitor (developed by the users' organization)
CPU: 48-bit processor @ 208 kHz[2]
Memory: 192 kilobytes (32,768 × 48 bits)[2]
Storage: —
MIPS: 0.1
FLOPS: —
Predecessor: —
Successor: CDC 3600, 3800 and 3400

The CDC 1604 is a 48-bit computer designed and manufactured by Seymour Cray and his team at the Control Data Corporation (CDC). The 1604 is known as one of the first commercially successful transistorized computers. (The IBM 7090 was delivered earlier, in November 1959.) Legend has it that the 1604 designation was chosen by adding CDC's first street address (501 Park Avenue) to Cray's former project, the ERA-UNIVAC 1103.[3]

A cut-down 24-bit version, designated the CDC 924, was shortly thereafter produced, and delivered to NASA.[4]

The first 1604 was delivered to the U.S. Navy Postgraduate School in January 1960[5] for JOVIAL applications supporting major Fleet Operations Control Centers, primarily for weather prediction in Hawaii, London, and Norfolk, Virginia. By 1964, over 50 systems were built. The CDC 3600, which added five op codes, succeeded the 1604 and "was largely compatible" with it.[6]

One of the 1604s was shipped to the Pentagon for DASA (the Defense Atomic Support Agency) and used during the Cuban Missile Crisis to predict possible strikes by the Soviet Union against the United States.

A 12-bit minicomputer, called the CDC 160, was often used as an I/O processor in 1604 systems. A stand-alone version of the 160 called the CDC 160-A was arguably the first minicomputer.[7]

Architecture

Two-view drawing of a CDC 1604 with scale
CDC 1604 registers
47 . . . 14 . . . 00 (bit position)
Operand registers (48 bits)
A Accumulator
Q Auxiliary Arithmetic register
Program counter (15 bits)
  P Program counter
Index registers (15 bits)
  1 Index 1
  2 Index 2
  3 Index 3
  4 Index 4
  5 Index 5
  6 Index 6

Memory in the CDC 1604 consists of 32K 48-bit words of magnetic core memory with a cycle time of 6.4 microseconds.[6] It is organized as two banks of 16K words each, with odd addresses in one bank and even addresses in the other. The two banks are phased 3.2 microseconds apart, so the average effective memory access time is 4.8 microseconds. The computer executes about 100,000 operations per second.
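Seen arithmetically, the 4.8-microsecond figure is the average of the two cases the interleaving creates: a reference that falls in the same bank as the previous access must wait the full 6.4-microsecond cycle, while a reference to the other bank waits only the 3.2-microsecond phase offset, and for randomly distributed addresses each case occurs about half the time, giving ½ × 6.4 µs + ½ × 3.2 µs = 4.8 µs.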

Each 48-bit word contains two 24-bit instructions. The instruction format is 6-3-15: six bits for the operation code, three bits for a "designator" (index register for memory access instructions, condition for jump (branch) instructions) and fifteen bits for a memory address (or shift count, for shift instructions).
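As a rough illustration of this packing, the following Python sketch (the names and helper structure are ours, not CDC's) splits a 48-bit word into its two 24-bit instructions and each instruction into the 6-3-15 fields described above:

    def decode_word(word48):
        """Split a 48-bit storage word into its upper and lower 24-bit instructions."""
        upper = (word48 >> 24) & 0xFFFFFF
        lower = word48 & 0xFFFFFF
        return decode_instruction(upper), decode_instruction(lower)

    def decode_instruction(instr24):
        """Break a 24-bit instruction into its 6-3-15 fields."""
        opcode     = (instr24 >> 18) & 0o77       # 6-bit operation code
        designator = (instr24 >> 15) & 0o7        # 3-bit index register / jump condition
        address    = instr24 & 0o77777            # 15-bit address or shift count
        return opcode, designator, address

For instance, decode_instruction(0o14100100) returns (0o14, 1, 0o100): two octal digits of opcode, one of designator, and five of address.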

The CPU contains a 48-bit accumulator (A), a 48-bit Auxiliary Arithmetic register (Q), a 15-bit program counter (P), and six 15-bit index registers (1-6).[8] The Q register was usually used in conjunction with A for forming a double-length register AQ or QA, participating with A in multiplication, division and logical product (masking) operations, and temporary storage of A's contents while using A for another operation.[9]

Internal integer representation uses ones' complement arithmetic. Internal floating point format is 1-11-36: one bit of sign, eleven bits of offset (biased) binary exponent, and thirty-six bits of binary significand.[10]
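A minimal sketch of how such a floating-point word could be picked apart follows; the exponent bias of 1024 and the handling of negatives are illustrative assumptions, not the hardware's exact definition:

    def unpack_float48(word48):
        """Pick apart a 48-bit word in the 1-11-36 floating-point layout.
        Only positive values (sign bit 0) are handled; on the real machine a
        negative value is the ones' complement of the whole word, which this
        sketch does not model. The exponent bias of 1024 is an assumption."""
        sign        = (word48 >> 47) & 0x1
        exponent    = (word48 >> 36) & 0x7FF        # 11-bit biased exponent
        significand = word48 & ((1 << 36) - 1)      # 36-bit binary significand
        assert sign == 0, "negative encodings are outside this sketch"
        return (significand / float(1 << 36)) * 2.0 ** (exponent - 1024)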

The most-significant three bits of the accumulator are converted from digital to analog and connected to a tube audio amplifier contained in the console. This facility could be used to program audio alerts for the computer operator, or to generate music. Those familiar with the inner workings of the software could often hear what parts of a task were being performed by the CDC 1604; as a debugging aid, for example, a never-ending repetitive musical phrase indicated the program was stuck in a loop.
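As a toy model of that audio path (purely illustrative; the function name and 0.0-to-1.0 scaling are ours):

    def console_tone_level(accumulator):
        """Map the accumulator's top three bits to one of eight output levels,
        a rough stand-in for the console's digital-to-analog path feeding the
        tube audio amplifier."""
        top_bits = (accumulator >> 45) & 0b111      # most-significant 3 of 48 bits
        return top_bits / 7.0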

Uses and applications


In 1960, one of the first text-mining applications, Masquerade, was written for the Marathon Oil Company in Findlay, Ohio; it used syntactic structures underlying text data to mask out words and phrases for searching purposes.[11] During 1969, Fleet Operations Control Center, Pacific (FOCCPAC at Kunia) on Oahu in Hawaii launched an Automated Control Environment (ACE) using a cluster of five CDC 160As to supervise a multi-tasking network of four CDC 1604s.

The Minuteman I was the first U.S. solid-rocket ICBM system to be fielded. There were two entirely separate ground station designs which were developed independently. The smaller, more elegant, single silo design incorporated two redundant CDC 1604 computer systems, each equipped with dual cabinets containing four 200 bpi magnetic tape drives. The computers were used to pre-compute guidance and aiming control information. Results based on current weather and targeting information were downloaded into the missile prior to launch. Model displays of both of these ICBM ground station designs, including block models of the CDC 1604 computers, may be viewed at the Octave Chanute Aerospace Museum in Rantoul, Illinois.

The third version of the PLATO computer-based educational system was implemented on a CDC 1604-C.[12]

JOVIAL was used as the main programming language of the CDC 1604, while octal was used to program shared services supported by the CDC 160A.[13] NAVCOSSACT based at the Washington Navy Yard provided systems and training support.

The CDC 1604 was used to compose Sailboat and other artworks by Sam Schmitt and Stockton Gaines.[14]

Similar machines


The 1604 design was used by the Soviet nuclear weapons laboratory. Their BESM-6 computer, which entered production in 1968, was designed to be somewhat software compatible with the CDC 1604,[15] but it ran 10 times faster and had additional registers.

The 924

CDC 924
CDC 924 with a figure for scale
Design
Manufacturer: Control Data Corporation
Designer: Seymour Cray
Release date: 1961[16]
Units sold: 12+ (1964)
Price: $180,000[1]
Casing
Dimensions: height 173 cm (68 in), length 157 cm (62 in), width 66 cm (26 in)[17]
Weight: 1,430 pounds (650 kg)[17]
Power: 2.3 kW @ 208 V, 60 Hz[17]
System
Operating system: —
CPU: 24-bit processor @ 188 kHz
Memory: 24 kilobytes (8,192 × 24 bits)[17]
Storage: —
MIPS: —
FLOPS: —
Predecessor: —
Successor: CDC 3000

The CDC 924 is a 24-bit computer that supports the use of "any input-output devices capable of communicating with the 160 and/or 1604 computer,"[18] and its six independent channels permit three simultaneous input operations even as three channels concurrently perform output.

Like many CDC processors,[8] it used ones' complement arithmetic.

Some advanced features of the 924, which included 64 instructions, were:

  • six index registers, with the designator value "7" reserved to indicate indirect addressing
  • an execute instruction (in what the hardware reference manual called "a subroutine of a single instruction")[18]: p. 2-41
  • powerful Storage Search instructions[18]: pp. 2-32 to 2-35

from Grokipedia
The CDC 1604 was a 48-bit transistorized computer system developed and manufactured by Control Data Corporation (CDC), released on October 16, 1959, as the company's first commercial product and recognized as the world's fastest computer at the time. It featured 32,768 words of magnetic core memory with a 6.4-microsecond cycle time, parallel arithmetic operations including addition in 1.2 microseconds, and support for 62 instructions encoded two per 48-bit word, enabling real-time data processing for applications such as weapons systems control and large-scale scientific computations. Designed primarily by engineer Seymour Cray—who later pioneered supercomputers—the system included six index registers, indirect addressing, and three input/output channels, all housed in a compact cabinet measuring approximately 5 feet 8 inches high, 7 feet 5 inches wide, and 2 feet 3 inches deep, weighing 2,200 pounds. Founded in 1957 by William Norris and a team of former Sperry Rand engineers, CDC targeted the growing demand for solid-state computing beyond vacuum-tube limitations, with the 1604 marking a shift toward transistor-based designs that improved reliability and speed for military and commercial users. The first unit was delivered to the U.S. Navy Postgraduate School in January 1960 for JOVIAL programming in fleet operations, underscoring its role in advancing real-time control systems. Its architecture, including a single-address format with 6-bit opcodes and support for single-precision floating-point and double-length arithmetic operations, influenced subsequent CDC models like the 3600 series and laid groundwork for Cray's later innovations at CDC and beyond, contributing to the evolution of supercomputing.

History and Development

Design Origins

The design of the CDC 1604 was heavily influenced by earlier vacuum-tube-based computers, such as the ERA 1103 (later commercialized as the UNIVAC 1103), which provided foundational concepts in scientific computing but suffered from the limitations of tube technology. These predecessors, developed in the early 1950s under military contracts, emphasized high-speed data processing for cryptologic and scientific applications, but their reliance on vacuum tubes led to frequent failures due to heat generation and short component lifespans. The CDC 1604 represented a pivotal transition to all-transistorized construction, leveraging germanium transistors to achieve greater reliability, reduced power consumption, and faster switching speeds—up to 200 kilocycles per second—while minimizing the overheating issues that plagued tube-based systems. Development of the CDC 1604 commenced in 1957 at the newly founded Control Data Corporation (CDC), driven primarily by U.S. military demands for compact, high-performance computers capable of real-time processing in defense applications such as fleet operations and atomic support. The project addressed key challenges of the era, including the need for modular architectures that facilitated easier maintenance and upgrades, contrasting with the bulky, non-standardized designs of vacuum-tube machines. This modularity was achieved through standardized circuit modules and a chassis-based layout, allowing technicians to swap components without extensive rewiring, which improved system uptime in demanding military environments. Initial specifications for the CDC 1604 included a 48-bit word length, chosen to support high-precision scientific calculations without routinely requiring double-precision operations that would slow performance on shorter-word machines. Germanium transistors were selected for their superior efficiency in early solid-state designs, offering low power dissipation and high gain suitable for the machine's parallel processing needs, though they were later supplemented in successor systems with faster variants.

Introduction and Production

The CDC 1604, developed by Control Data Corporation (CDC), was publicly announced on October 16, 1959, marking the company's entry into the commercial computing market with its first major product. The system represented a significant advancement as one of the earliest fully transistorized computers available for commercial purchase, designed primarily by engineer Seymour Cray to address the reliability and heat issues inherent in vacuum tube-based machines. The first unit was delivered in January 1960 to the U.S. Navy Postgraduate School in Monterey, California, where it supported JOVIAL programming for fleet operations and scientific applications. Full-scale production began ramping up later that year at CDC's facilities in Minneapolis, Minnesota, following an initial order from the U.S. Navy Bureau of Ships in June 1959. Production of the CDC 1604 employed modular assembly techniques, allowing for efficient construction of its core components, including transistor logic and magnetic core memory. By 1964, approximately 50 units had been produced and installed, primarily for high-end users due to the system's complexity and cost. Priced at around $700,000 per system—depending on configuration—the CDC 1604 was targeted exclusively at government agencies, military installations, and large research institutions capable of affording such investments. This limited rollout reflected the era's nascent market for transistorized computing, where demand was driven by needs for data processing and scientific computation rather than broad commercial adoption. Initial reception of the CDC 1604 was highly positive within scientific and defense communities, with praise centered on its status as the first large-scale, fully transistorized computer to achieve commercial viability and superior performance over predecessors. Users lauded its reliability, speed—capable of approximately 100,000 operations per second—and compact design compared to earlier systems, which often required extensive cooling and maintenance. The machine's success helped establish CDC as a leader in high-performance computing, paving the way for subsequent models and demonstrating the practical advantages of solid-state technology in overcoming the limitations of tube-based architectures.

Key Contributors

Seymour Cray led the design of the CDC 1604 as its principal architect at Control Data Corporation (CDC), where he directed the overall system architecture and spearheaded the transition from vacuum tubes to germanium transistors, marking one of the first commercial implementations of solid-state technology in a high-performance computer. His prior experience at Engineering Research Associates (ERA), a firm focused on advanced computing for military applications, shaped his emphasis on achieving maximum computational speed through innovative circuitry and reduced complexity. Supporting Cray was a core team of engineers at CDC, including Frank Melaney, George Henson, and James Thornton, who had collaborated with him since his ERA days and contributed to critical aspects of the design, including circuit implementation. These engineers helped translate Cray's concepts into a functional machine, with particular attention to reliable transistor-based logic that enabled the 1604's high-speed operations. The project's design was influenced by requirements from U.S. government agencies, including the National Security Agency (NSA), which became an early major customer and acquired multiple units for secure cryptologic processing; this emphasized features like a simplified instruction set for enhanced reliability in sensitive environments. Cray's underlying philosophy centered on minimalism—eschewing unnecessary components to optimize performance and reduce failure points—resulting in an elegant interconnection system and a RISC-like architecture that prioritized efficiency over complexity.

Technical Architecture

Processor Design

The CDC 1604 employed a 48-bit processor architecture designed for binary operations in a stored-program configuration. Operating at an effective clock speed of 208 kHz (with a 4.8 µs cycle time), it represented a leap in computational efficiency for its era, enabling high-speed arithmetic and data transfer within its single-address logic framework. The processor utilized thousands of germanium transistors—estimated at around 25,000 in total—combined with diodes and amplifiers to form a fully solid-state design, eliminating the reliability issues and bulk of vacuum-tube predecessors. This transistor-based implementation supported modular construction, with logic elements mounted on printed circuit cards and organized into stackable modules for maintainability and scalability. The machine employed ones' complement arithmetic for fixed-point operations. Central to the design was a modular logic structure separating the arithmetic unit, the control section, and the input/output circuits to facilitate concurrent operations. The arithmetic unit featured a 48-bit accumulator (A register) and a 48-bit auxiliary register (Q register), which handled fixed- and floating-point arithmetic, logical operations, and masking, with fixed-point results formed under modulus 2^48 − 1. Meanwhile, the control section, incorporating a 15-bit program counter (P register) and program control register (U1), orchestrated instruction fetch, decode, and execution sequencing. This separation allowed overlapping of arithmetic computations with control functions, enhancing overall throughput in representative programs. A set of six 15-bit index registers (B1 through B6) provided additional flexibility for address arithmetic and loop control. Addressing capabilities included direct and indirect modes, with index register modification to support access to the system's maximum of 32,768 48-bit words of magnetic core memory organized in two interleaved banks. Instructions were packed two per 48-bit word, promoting dense code and efficient memory utilization during program execution. Power consumption for the processor and associated core logic totaled approximately 5 kW at 208 V, 60 Hz—a dramatic reduction from the 50 kW or more typical of contemporary vacuum-tube computers, owing to the lower heat and energy demands of transistorized components.
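A minimal Python sketch of ones' complement addition with end-around carry, the scheme used for fixed-point arithmetic described above; the constant and function names are ours:

    WORD_BITS = 48
    MASK = (1 << WORD_BITS) - 1        # arithmetic is carried out modulo 2**48 - 1

    def ones_complement_add(a, b):
        """Add two 48-bit ones' complement values with end-around carry."""
        total = (a & MASK) + (b & MASK)
        if total > MASK:               # a carry out of bit 47...
            total = (total + 1) & MASK # ...wraps back into bit 0 (end-around carry)
        return total

    def ones_complement_negate(a):
        """Negation is a bitwise complement of all 48 bits."""
        return (~a) & MASK

    # Example: 5 + (-3) == 2 in this representation.
    assert ones_complement_add(5, ones_complement_negate(3)) == 2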

Memory System

The CDC 1604 utilized magnetic core memory as its main storage, a technology prevalent in mid-20th-century computers for its reliability and non-volatility. This core memory provided a capacity of 32,768 48-bit words, enabling the system to handle substantial datasets for the scientific and engineering computations of the era. The memory was organized into two independent, alternately phased banks of 16,384 words each—one for odd addresses and one for even—facilitating interleaved access to improve overall performance during program execution. Access to the core memory involved a destructive readout process, where reading a word altered its state in the cores, necessitating an immediate rewrite to preserve the data; this restore step was built into the memory cycle. Each memory bank had a cycle time of 6.4 microseconds, comprising a 2.2-microsecond read access followed by write and recovery phases, resulting in an effective average cycle time of 4.8 microseconds for random accesses or 3.2 microseconds when alternating between banks. Words were read directly into the processor's X register and written back using the Z1 and Z2 registers, ensuring seamless integration with the central processing unit's operations. Error detection in the core memory relied on built-in fault indicators displayed on the operator's console, such as "Odd Storage Fault" and "Even Storage Fault" lights, which signaled issues in specific banks and halted operations until a master clear was performed to reset the system. This hardware-level monitoring helped maintain data integrity without advanced correction mechanisms, aligning with the era's emphasis on robust but simple reliability features in transistor-based systems.
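The bank selection and read-restore behavior can be sketched as follows; this is a toy model under the assumptions stated in the comments, not CDC's implementation, and the class and method names are illustrative:

    class InterleavedCore:
        """Toy model of the 1604's storage: two 16,384-word banks, even addresses
        in one and odd addresses in the other; a read is destructive and is
        immediately followed by a restore, as described above."""
        WORDS_PER_BANK = 16384
        WORD_MASK = (1 << 48) - 1

        def __init__(self):
            self.banks = [[0] * self.WORDS_PER_BANK for _ in range(2)]

        def _locate(self, address):
            bank = address & 1                 # low bit of the address picks the bank
            return self.banks[bank], (address >> 1) % self.WORDS_PER_BANK

        def read(self, address):
            bank, index = self._locate(address)
            value = bank[index]
            bank[index] = 0                    # destructive readout clears the cores
            bank[index] = value                # the same cycle restores the word
            return value

        def write(self, address, value):
            bank, index = self._locate(address)
            bank[index] = value & self.WORD_MASK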

Instruction Set

The CDC 1604 employs a fixed-length 24-bit instruction format, allowing two instructions to be packed into each 48-bit word for efficient utilization of storage. Each instruction consists of a 6-bit operation code specifying the operation, a 3-bit designator field that selects an index register (B1 through B6) for address modification or indicates indirect addressing (designator 7), and a 15-bit base execution address that forms the effective address when combined with the designator. This structure supports direct, indexed, and indirect addressing modes, with operands typically exchanged between memory and the accumulator (A register) or the auxiliary register (Q register). The system includes 62 basic instructions, encoded with unique 6-bit opcodes ranging from 01 to 76 (opcodes 00 and 77 trigger faults). These are categorized into arithmetic, logical, control, and input/output operations, emphasizing a load-store style in which data manipulation occurs primarily between the A and Q registers and memory. Arithmetic instructions operate on 48-bit fixed-point or floating-point values, using the combined A-Q register pair; for example, the add instruction (opcode 14) adds a memory operand to the A register, while subtract (15) performs the corresponding subtraction, and multiply (24) computes a 96-bit product by multiplying the A register contents with a memory operand, storing the high 48 bits in Q and the low 48 bits in A. Floating-point variants, such as floating add (30) and floating multiply (32), handle operands in a 1-11-36 format (sign bit, 11-bit biased exponent, 36-bit significand) without requiring separate extensions. Logical instructions perform bit-level operations on 48-bit words in the A register. Key examples include AND (44), which bitwise ANDs the A register with a memory operand; selective set (40), functioning as an OR by setting bits in A where the operand has 1s; and masking operations (41–43) for clearing or complementing bits. Shift instructions (opcodes 01–07) provide single- or double-length logical, arithmetic, or circular shifts on A or the A-Q pair, supporting normalization and alignment for arithmetic tasks. Control instructions manage program flow and execution. Unconditional jumps (22 or 23) transfer control to the specified address, optionally clearing or preserving the A register; conditional jumps (75 or 76) branch based on A-register conditions like zero or overflow, with 76 also serving as a halt to stop execution. Input/output instructions (e.g., 62 for transfer) interface with peripherals using the external function facility. The instruction set's simplicity, with uniform fixed-length encoding and hardwired implementation, facilitates rapid decoding and execution without microprogramming, influencing later reduced instruction set designs. Programming typically involves a basic symbolic assembler that maps mnemonics to opcodes, supporting both fixed- and floating-point operations natively, though optional software libraries could extend precision handling.
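A tiny illustrative "assembler" in Python shows how the 6-3-15 fields and the two-instructions-per-word packing fit together; the mnemonics are invented for readability, while the octal opcodes are those cited above:

    # Illustrative mnemonics (our names); the octal opcodes are those cited above.
    OPCODES = {"ADD": 0o14, "SUB": 0o15, "MUL": 0o24, "JMP": 0o22}

    def encode(mnemonic, designator, address):
        """Pack one 24-bit instruction in the 6-3-15 layout."""
        return (OPCODES[mnemonic] << 18) | ((designator & 0o7) << 15) | (address & 0o77777)

    def pack_word(upper, lower):
        """Two 24-bit instructions share each 48-bit storage word."""
        return (upper << 24) | lower

    # ADD the word at address 100 (octal), indexed by B1, then SUB the word at 101.
    word = pack_word(encode("ADD", 1, 0o100), encode("SUB", 0, 0o101))
    print(oct(word))   # 0o1410010015000101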

System Configuration

Input/Output Integration

The CDC 1604 utilized the CDC 160, a 12-bit minicomputer, as a dedicated I/O controller to manage data transfers between the central processor and external peripherals, incorporating direct memory access (DMA) capabilities to minimize main CPU involvement in I/O operations. This setup allowed the CDC 160 to process interrupts from peripherals and execute DMA transfers independently, supporting high-speed data exchange at rates up to 70,000 words per second while the main processor continued computations. The system's channel architecture featured up to six independent buffer channels—three for input and three for output—enabling concurrent I/O operations that overlapped with CPU execution and reduced overall system overhead. A seventh high-speed transfer channel further facilitated rapid bulk data movement, with each channel interrogated by a scanner to detect readiness from connected devices, ensuring efficient data flow without stalling the processor. The interrupt system was priority-based, allowing real-time responses to I/O events such as buffer completion or peripheral status changes, with the CDC 160 handling multiple interrupt lines in sequence to prioritize critical tasks. Interrupts triggered automatic jumps to designated memory locations for servicing, followed by a return to the main program after lockout clearance, supporting seamless integration in multi-device environments. A standalone variant, the CDC 160-A, provided independent I/O processing capabilities, functioning as a desk-sized unit for off-line tasks or as a dedicated controller without reliance on the main 1604 system. This configuration enhanced flexibility for shared peripheral resources, such as magnetic tape units, across multiple computing setups.
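The scanning idea can be sketched schematically as follows; this is a simplified model of the concept, not the 1604's actual channel hardware, and the names are ours:

    from collections import deque

    def scan_channels(channels):
        """Interrogate each buffer channel in turn and service those with a word
        ready, so that I/O transfers overlap the CPU's computation."""
        for name, buffer in channels.items():
            if buffer:                          # channel signals it is ready
                yield name, buffer.popleft()    # move one word on this scan pass

    # Example: three input and three output channels, as on the 1604.
    channels = {f"in{i}": deque() for i in (1, 2, 3)}
    channels.update({f"out{i}": deque() for i in (1, 2, 3)})
    channels["in1"].extend([0o1234, 0o7654])
    serviced = list(scan_channels(channels))    # one word moved from "in1" this pass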

Peripherals and Expansion

The CDC 1604 supported a range of standard peripherals designed for input, output, and bulk storage, enabling it to function as both a scientific computer and a general data-processing system. These included paper tape readers and punches for program loading and output, which were common for the era due to their reliability and low cost. The photoelectric tape reader operated at 350 characters per second, facilitating rapid input of programs or data from 8-level paper tape. The associated tape punch, typically a Teletype BRPE model adapted for the 1604, produced output at 60 characters per second, suitable for generating archival tapes or diagnostic logs. For printed output, the CDC 1604 could integrate high-speed line printers achieving speeds of up to 1,000 lines per minute with 120 print positions, or slower alternatives like the 717 printer at 150 lines per minute. Card readers processed standard 80-column punched cards at 150 cards per minute, providing an alternative input method for program and data entry tasks. These peripherals connected via the system's external function instructions and I/O channels, allowing seamless integration without interrupting core computations. Bulk storage was handled by the Model 1607 magnetic tape system, which utilized FR307 tape handlers with 7-track, 1/2-inch tape at a recording density of 200 characters per inch and a transport speed of 150 inches per second, yielding a transfer rate of 30,000 characters per second. Each subsystem cabinet housed four tape drives, supporting up to 2,500-foot reels, and the system allowed for expansion to six such cabinets—accommodating a total of 24 drives—for large-scale archival storage and data transfer in scientific simulations or data processing runs. System expansion was facilitated through modular cabinets for peripherals and I/O equipment. The operator console featured a Teletype or modified IBM Model B electric typewriter for real-time interaction, displaying status and errors while providing switches and indicator lights for basic diagnostics, such as memory dumps or instruction stepping.
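The quoted 30,000-character-per-second tape rate is simply the product of the stated parameters: 200 characters per inch × 150 inches per second = 30,000 characters per second.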

Operating Environment

The CDC 1604 operated without a comprehensive operating system, relying instead on rudimentary monitor programs to oversee basic system functions such as program loading and execution. The primary monitor software was the CO-OP Monitor, a secondary control system developed for batch processing environments, which managed job scheduling, input/output operations, and program sequencing in coordination with a master control system (MCS). This monitor facilitated multi-job execution by automatically assigning I/O units, handling library edits via utilities like LIBEDIT, and providing error recovery mechanisms, including diagnostics for issues such as checksum or directory errors. Programming support centered on a FORTRAN compiler tailored for scientific applications, developed by a dedicated software team; this compiler, compatible with FORTRAN II standards, enabled efficient floating-point computations and was essential for numerical simulations and data processing tasks. System-level programming utilized assembly languages like CODAP1, which generated absolute or relocatable machine code for low-level control and optimization. Program loading typically involved card-deck or paper tape loaders, initiated manually via the console or pre-stored routines, with operator intervention required to switch between jobs in multi-user configurations lacking automated multitasking. Built-in diagnostic capabilities included interrupt-driven fault indicators and console-based testing routines, allowing operators to verify hardware integrity through light panels and manual controls without external tools.

Applications and Impact

Military and Government Uses

The CDC 1604 found its primary early adopters in U.S. military and intelligence agencies, with the U.S. Navy receiving the first unit in January 1960 at the U.S. Navy Postgraduate School in Monterey, California, for applications supporting fleet operations control centers using the JOVIAL programming language. The National Security Agency (NSA) also became a key user starting in September 1960, acquiring multiple systems—including CDC-1604(1) with a special WELCHER attachment for targeted analytic tasks, followed by units in February 1961, March 1962, January 1963, and July 1963—deployed for research computing and specialized problem-solving in cryptologic environments. These deployments leveraged the machine's transistor-based design, influenced by prior cryptologic projects like BOGART, to handle intensive computational demands in secure settings. A notable installation, delivered in December 1960, supported ordnance-related computations, including missile trajectory analysis as part of broader defense simulations. Additionally, a militarized variant known as the Digital GeoBallistic Computer, based on the CDC 1604 architecture, was adapted for targeting systems, performing real-time guidance and trajectory calculations in naval applications. Such configurations emphasized reliability in high-stakes military simulations, drawing on the system's inherent stability for uninterrupted operations in defense scenarios. For classified work, the CDC 1604 incorporated custom attachments and architectural elements suited to secure processing, such as specialized peripherals for controlled data handling, enabling its use in NSA's analytic facilities without explicit public disclosure of encryption mechanisms. In these environments, the computer achieved its rated performance of approximately 100,000 operations per second, facilitating rapid execution of cryptographic algorithms and simulation models critical to cryptologic tasks.

Scientific and Commercial Applications

The CDC 1604 found significant application in scientific research at universities, where it supported physics simulations and data analysis starting in the early 1960s. At the University of Illinois' Coordinated Science Laboratory, the system powered computational tasks in control systems and related research, enabling researchers to model physical phenomena and analyze experimental data with its 48-bit architecture for high-precision calculations. This setup facilitated advancements in physics-related simulations, contributing to joint programs in scientific computing. Another research laboratory acquired a CDC 1604 in 1962, using it for complex scientific computations such as nuclear weapons simulations. In the commercial sector, the CDC 1604 was adopted by oil companies for seismic data processing during the early 1960s, aiding exploration efforts through efficient handling of geophysical datasets. For instance, University Computing Corporation utilized a CDC 1604 in Dallas to provide processing services to engineers at Sun Oil Company and other firms in the area, performing computations essential for interpreting seismic surveys and resource mapping. These applications highlighted the computer's capability for real-time data manipulation in industrial environments. Educationally, the CDC 1604 played a key role in programmer training at university labs, supporting FORTRAN-based programs for science and engineering curricula. At the University of Illinois, it underpinned the PLATO III system from the mid-1960s, delivering interactive lessons in the sciences to students via 20 terminals, fostering hands-on programming skills. Other institutions similarly integrated it into FORTRAN classes for classroom computing exercises, enhancing practical training in programming and numerical methods. A notable case study involves weather modeling at the Fleet Numerical Weather Facility (FNWF), a precursor to NOAA's numerical prediction centers, where the CDC 1604 processed global meteorological data starting in 1961. Installed in Monterey, California, it executed approximately two billion computations per forecast cycle across 4,000 northern hemisphere grid points, leveraging 48-bit precision to generate accurate contoured predictions for sea height and wave patterns with resolutions down to 1/100th of an inch. This enabled hourly tailored forecasts for naval and civilian operations, establishing early foundations for modern atmospheric simulations.

Performance Milestones

Approximately 50 units of the CDC 1604 were produced and sold, underscoring its commercial success in the early transistorized-computing era. The CDC 1604 demonstrated impressive computational speed for its era, achieving an average effective cycle time of 4.8 microseconds for random addresses in representative programs, equivalent to approximately 208,000 memory cycles per second. Its basic parallel addition operation completed in 1.2 microseconds exclusive of memory access, contributing to overall performance that exceeded the IBM 7090 in select floating-point tasks, particularly where the 48-bit word length reduced the need for double-precision arithmetic. A key reliability milestone came in 1960 with the initial production systems, which featured zero transistor failures during early deployment, marking a significant advancement in solid-state dependability over predecessors. The system's modular design allowed upgrades to support real-time control applications, including simulations and weapons systems processing, where it handled high-speed data flows without interruption. In terms of energy efficiency, the CDC 1604 offered a power-to-performance ratio of about 22.5 kIPS/kW, representing a substantial improvement over vacuum tube-based machines that consumed far more power for similar throughput.

Legacy and Comparisons

Variants and Successors

The CDC 924, introduced in 1962, served as a 24-bit variant of the 48-bit CDC 1604, offering a more compact design while maintaining compatibility with 1604 peripheral equipment. Its core memory provided storage starting at 8,192 words, with three 48-bit buffer output registers for compatibility with 1604 I/O devices and 12 lower-order bits aligned with smaller 160-series equipment. This configuration supported faster I/O operations, including 1.8 μsec access, making it suitable for real-time applications requiring efficient data handling. The CDC 160A, a 12-bit standalone I/O processor released in 1962, evolved from the earlier CDC 160 to enhance input/output processing in systems paired with the CDC 1604. It featured up to 32,768 words of core memory with a 6.4 μsec cycle time and an average instruction execution time of 15 μsec, enabling data exchange with peripherals at rates up to 70,000 words per second. In hybrid 1604-924 configurations, the 160A acted as a peripheral controller for devices like magnetic tapes and printers, supporting real-time biomedical and scientific computing tasks through buffered I/O and inter-system links via microwave at approximately 1,000,000 bits per second. The CDC 1604 design influenced the CDC 3000 series, introduced in May 1964 as a direct follow-on to both the 1604 and 924, expanding capabilities for medium-scale computing. This series included 24-bit lower models like the 3100 (8K–32K words) and 3300 (8K–262K words) for broader commercial use, alongside 48-bit upper models such as the 3600, which scaled memory addressing and performance while retaining upward compatibility with 1604 software and architecture. Over time, the lineage progressed toward 60-bit systems like the CDC 6600, building on the 1604's transistorized foundation for higher-speed scientific processing. Upgrade paths for CDC 1604 installations involved field modifications, such as transistor replacements to address early reliability issues in the all-transistor design and additions of I/O channels to support expanded peripherals without full system replacement. These enhancements allowed incremental scaling of memory banks and processing capacity, aligning with evolving real-time and data-intensive requirements.

Similar Contemporary Machines

The CDC 1604, a transistorized 48-bit scientific computer released in 1959, shared design parallels with contemporaries like the IBM 7090, which also targeted scientific computing but used a transistorized 36-bit architecture. While the IBM 7090 achieved higher instruction throughput at approximately 139 KIPS on a Gibson mix benchmark, the CDC 1604's longer word length enabled more efficient handling of double-precision arithmetic without additional overhead, compensating for its slightly lower 81 KIPS performance in similar tests. In contrast, the UNIVAC LARC, delivered in 1960 as a specialized large-scale system for the Lawrence Radiation Laboratory and the U.S. Navy, represented a more ambitious but cumbersome rival with dual processors and extensive drum storage totaling up to 6 million words. Priced at around $6 million and requiring a 3,000-square-foot room with 350 KVA power draw, the LARC's scale dwarfed the CDC 1604's compact footprint of two primary cabinets (measuring roughly 5 feet 8 inches high, 7 feet 5 inches wide, and 2 feet 3 inches deep for the main unit), making it far more expensive and power-intensive for similar scientific workloads. The LARC's addition times of about 4 microseconds supported roughly 250 KIPS, but its hybrid tube-transistor design and massive infrastructure limited commercial viability compared to the CDC 1604's efficient, all-transistor approach. The UK's Atlas, operational from 1962, offered a forward-looking alternative with transistorized logic and pioneering virtual memory via a one-level store integrating core and drum storage, allowing seamless access to up to 1 million 48-bit words without the CDC 1604's reliance on fixed, core-only storage of up to 32,768 words. Achieving around 700 KIPS through features like 128 index registers and rapid floating-point operations (e.g., 1.6 microseconds for addition), the Atlas emphasized multitasking and supervisor software for higher effective throughput, though its room-filling installation contrasted sharply with the CDC 1604's two-cabinet efficiency. This innovation highlighted a key design divergence, prioritizing expandability over the CDC 1604's simpler, more immediate-access architecture for defense and scientific applications.

Historical Significance

The CDC 1604 marked a pivotal moment in computing history by demonstrating the commercial viability of fully transistorized computers, replacing unreliable vacuum tubes with more efficient solid-state components and enabling reliable, high-performance systems at a scale suitable for widespread adoption. Released in 1959, it was among the first such machines to achieve significant market success, with each unit priced at approximately $1 million and proving attractive to military and scientific users for its speed and reliability. This success helped transition the industry from first-generation vacuum-tube systems to second-generation transistor-based architectures, laying foundational groundwork for the emergence of minicomputers and later supercomputers by showcasing scalable, cost-effective integration in large-scale computing. Seymour Cray's design of the 1604 represented a breakthrough that propelled Control Data Corporation (CDC) to the forefront of the computing industry, establishing the company as a leader in high-performance systems shortly after its founding in 1957. Cray's innovative architecture, which emphasized speed and efficiency through streamlined transistor logic, not only secured CDC's first major $2.5 million contract with the U.S. Navy but also built the technical momentum for subsequent innovations, including the CDC 6600—the world's first true supercomputer—and Cray's later founding of Cray Research in 1972. This early triumph under Cray's leadership shifted CDC's focus toward supercomputing, influencing decades of advancements in vector processing and parallel architectures. The CDC 1604 accelerated the broader shift to second-generation computers worldwide, with over 50 units produced and installed by 1964 in military, government, and research settings in the United States and abroad, underscoring its role in democratizing access to powerful computing. These installations facilitated data processing and complex simulations, hastening the obsolescence of tube-based machines and inspiring competitors like IBM to accelerate their own initiatives. By embodying the practical advantages of solid-state technology—such as reduced size, lower power consumption, and higher reliability—the 1604 helped define the parameters of modern computing. Surviving examples of the CDC 1604 are preserved in key institutions, symbolizing the dawn of the solid-state era and providing tangible insights into early transistorized design. Notable units are held in museum collections, where they illustrate CDC's pioneering contributions, while others remain in archival holdings at former installation sites, ensuring the machine's legacy endures for historical study and education.
