Apollo TV camera

The Apollo program used several television cameras in its space missions in the late 1960s and 1970s; some of these Apollo TV cameras were also used on the later Skylab and Apollo–Soyuz Test Project missions. These cameras varied in design, and image quality improved significantly with each successive model. Two companies made the various camera systems: RCA and Westinghouse. The original slow-scan television (SSTV) cameras, running at 10 frames per second (fps), produced only black-and-white pictures and first flew on the Apollo 7 mission in October 1968. A color camera – using a field-sequential color system – flew on the Apollo 10 mission in May 1969, and on every mission after that; it ran at the North American standard of 30 fps. All of the cameras used image pickup tubes that were vulnerable to damage: one was ruined beyond repair when it was accidentally pointed at the Sun during the live broadcast of the Apollo 12 mission's first moonwalk. Starting with the Apollo 15 mission, a more robust, damage-resistant camera was used on the lunar surface. All of these cameras required signal processing back on Earth to make their frame rates and color encoding compatible with analog broadcast television standards.

Starting with Apollo 7, a camera was carried on every Apollo command module (CM) except Apollo 9. For each lunar landing mission, a camera was also placed inside the Apollo Lunar Module (LM) descent stage's modularized equipment stowage assembly (MESA). Positioning the camera in the MESA made it possible to telecast the astronauts' first steps as they climbed down the LM's ladder at the start of a mission's first moonwalk (extravehicular activity, or EVA). Afterwards, the camera would be detached from its mount in the MESA and either mounted on a tripod and carried away from the LM to show the EVA's progress, or mounted on a Lunar Roving Vehicle (LRV), from which it could be remotely controlled by Mission Control on Earth.

NASA set the initial specifications for television on the Apollo command module (CM) in 1962. Both analog and digital transmission techniques were studied, but the early digital systems required far more bandwidth than an analog approach: 20 MHz for the digital system, compared to 500 kHz for the analog system. The analog video standard chosen for the Block I CM, and thus for the early Apollo missions, was a monochrome signal with 320 active scan lines, progressively scanned at 10 frames per second (fps). RCA was awarded the contract to manufacture such a camera. It was understood at the time that motion fidelity from such a slow-scan television (SSTV) system would be lower than that of standard commercial television, but this was deemed sufficient, since astronauts would not be moving quickly in orbit or on the lunar surface.

Since the camera's scan rate was much lower than the approximately 30 fps of NTSC video, the television standard used in North America at the time, real-time scan conversion was needed to show its images on a regular TV set. NASA selected a scan converter manufactured by RCA to convert the black-and-white SSTV signals from the Apollo 7, 8, 9, and 11 missions.

When the Apollo TV camera transmitted its images, the ground stations received the raw, unconverted SSTV signal and split it into two branches. One branch was sent unprocessed to a fourteen-track analog data recorder, where it was stored on fourteen-inch-diameter reels of one-inch-wide magnetic tape running at 3.04 meters per second. The other branch was sent to the RCA scan converter, where it was processed into an NTSC broadcast television signal.

The conversion process started when the signal was fed to the RCA converter's high-quality 10-inch video monitor, whose screen was simply re-photographed by a conventional RCA TK-22 television camera using the NTSC broadcast standard of 525 scan lines interlaced at 30 fps. The monitor's persistent phosphors acted as a primitive framebuffer. An analog disk recorder, based on the Ampex HS-100 model, recorded the first field from the camera. It then fed that field, along with an appropriately time-delayed copy of it, to the NTSC field-interlace encoder. The original and copied fields together formed the first full 525-line interlaced frame, and the signal was then sent to Houston. The system repeated this sequence five more times, until it imaged the next SSTV frame, then repeated the whole process with each new frame downlinked from space in real time. In this way, the chain produced the extra 20 frames per second needed to deliver flicker-free images to the world's television broadcasters.
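The rate arithmetic behind this chain can be sketched in code. The following is a toy model only – the real converter was an optical and analog system, and the frame labels here are placeholders – but it shows how holding each 10 fps SSTV frame and re-emitting it as three two-field interlaced frames yields the extra 20 frames per second the text describes:

```python
# Toy model of the 10 fps SSTV -> 30 fps interlaced NTSC rate conversion.
# Frame contents are placeholder strings, not real video data.

SSTV_FPS = 10
NTSC_FPS = 30
REPEATS = NTSC_FPS // SSTV_FPS  # each SSTV frame becomes 3 NTSC frames

def convert(sstv_frames):
    """Expand a 10 fps frame sequence into a 30 fps interlaced sequence."""
    ntsc_frames = []
    for frame in sstv_frames:
        for _ in range(REPEATS):
            # An interlaced NTSC frame is two fields of the same image:
            # the odd scan lines, then the even scan lines.
            odd_field = (frame, "odd-lines")
            even_field = (frame, "even-lines")
            ntsc_frames.append((odd_field, even_field))
    return ntsc_frames

one_second = [f"sstv-frame-{i}" for i in range(SSTV_FPS)]
out = convert(one_second)
print(len(out))                    # → 30: full NTSC frame rate
print(len(out) - len(one_second))  # → 20: the "extra" frames per second
```

In the actual hardware, of course, nothing was copied digitally; the disk recorder replayed the stored field at the right moments so the downstream encoder saw a continuous 30 fps signal.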

This live conversion was crude compared to early 21st-century electronic digital conversion techniques. Image degradation was unavoidable: the optical limitations of the monitor and camera significantly lowered the original SSTV signal's contrast, brightness, and resolution. The video seen on home television sets was further degraded by the very long and noisy analog transmission path. The converted signal was sent by satellite from the receiving ground stations to Houston, Texas. From there, the network pool feed was sent by microwave relay to New York, where it was broadcast live to the United States and the world.

Apollo 7 and Apollo 8 used an RCA slow-scan, black-and-white camera. On Apollo 7, the camera could be fitted with either a wide-angle lens with a 160-degree field of view or a telephoto lens with a 9-degree angle of view. The camera had no viewfinder or monitor, so the astronauts needed help from Mission Control to aim it in telephoto mode.
