History of the railway track
from Wikipedia

Section of timber track from a 16th-century gold mine in Transylvania. The wagons were guided by the pronounced flange on the wooden wheels, and the narrow gauge of 480 mm (18 7/8 in) allowed the points to be altered by swinging the single switch rail.[1]
Contemporary illustration of guided truck used in 16th-century mines in Germany
Reconstruction of flat wooden track for transporting silver ore; guidance was by a vertical pin running between the timbers

The railway track, or permanent way, comprises the elements of a railway line: generally the pairs of rails, typically laid on sleepers (or ties) embedded in ballast, intended to carry the ordinary trains of a railway. It is described as a permanent way because, in the earlier days of railway construction, contractors often laid a temporary track to transport spoil and materials about the site; when this work was substantially completed, the temporary track was taken up and the permanent way installed.

The earliest tracks consisted of wooden rails on transverse wooden sleepers, which helped maintain the spacing of the rails. Various developments followed, with cast iron plates laid on top of the wooden rails and later wrought iron plates or wrought iron angle plates (angle iron as L-shaped plate rails). Rails were also individually fixed to rows of stone blocks, without any cross ties to maintain correct separation. This system also led to problems, as the blocks could individually move. The first version of Isambard Kingdom Brunel's 7 ft (2,134 mm) broad gauge system used rails laid on longitudinal sleepers whose rail gauge and elevation were pinned down by being tied to piles (conceptually akin to a pile bridge), but this arrangement was expensive and Brunel soon replaced it with what became the classic broad gauge track, in which the piles were forgone and transoms, similar to sleepers, maintained the rail gauge. Today, most rail track uses the standard system of rail and sleepers; ladder track is used in a few applications.

Developments in manufacturing technologies have led to changes in the design, manufacture and installation of rails, sleepers and the means of attachment. Cast iron rails, 4 feet (1.2 m) long, began to be used in the 1790s, and by 1820, 15-foot-long (4.6 m) wrought iron rails were in use. The first steel rails were made in 1857, and standard rail lengths increased over time from 30 to 60 feet (9.1–18.3 m). Rails were typically specified by weight per unit length, and these weights also increased. Railway sleepers were traditionally made of creosote-treated hardwoods, and this continued through to modern times. Continuous welded rail was introduced into Britain in the mid 1960s, followed by the introduction of concrete sleepers.

Wooden tracked systems


Plank ways

Fragment of Lucas Gassel's 1544 painting The Coppermine depicting railway

The earliest use of a railway track seems to have been in connection with mining in Germany in the 12th century.[2] Mine passageways were usually wet and muddy, and moving barrows of ore along them was extremely difficult. Improvements were made by laying timber planks so that wheeled containers could be dragged along by manpower. By the 16th century, the difficulty of keeping the wagon running straight had been solved by having a pin running in a gap between the planks.[3] Georg Agricola describes box-shaped carts, called "dogs", about half as large again as a wheelbarrow, fitted with a blunt vertical pin and wooden rollers running on iron axles.[4] An Elizabethan era example of this has been discovered at Silvergill in Cumbria, England,[5] and they were probably also in use in the nearby Mines Royal of Grasmere, Newlands and Caldbeck.[6] Where space permitted, round-section wooden tracks to take trucks with flanged wheels were installed: a painting from 1544 by the Flemish artist Lucas Gassel shows a coppermine with rails of this type emerging from an adit.[7][failed verification]

Edged rails


A different system was developed in England, probably in the late 16th century, near Broseley, for conveying coal from mines (some of them drift mines down the side of the Severn Gorge) to the River Severn. This line, probably a rope-hauled inclined plane, had existed 'long before' 1605.[8] It probably preceded the Wollaton Wagonway of 1604, which has hitherto been regarded as the first.[9][10]

In Shropshire, the gauge was usually narrow, to enable the wagons to be taken underground in drift mines. However, by far the greatest number of wagonways were near Newcastle upon Tyne, where a single wagon was hauled by a horse on a wagonway of about the modern standard gauge. These took coal from the pithead down to a staithe, where the coal was loaded into river boats called keels.[11]

Wear of the timber rails was a problem. They could be renewed by turning them over, but had to be regularly replaced. Sometimes, the rail was made in two parts, so that the top portion could easily be replaced when worn out. The rails were held together by wooden sleepers, covered with ballast to provide a surface for the horse to walk on.[citation needed]

Early iron rails


Cast iron strips could be laid on top of timber rails; such strips were probably first used in 1738, though there are claims that the technology goes back to 1716.[12] In 1767, Ketley ironworks began producing cast iron plates, which were fixed to the top of wooden rails with nails, to provide a more durable running surface. This construction was known as strap-iron rail (or strap rail) and was widely used on pre-steam railways in the United States.[13][14] Although relatively cheap and quick to build, such rails were unsuited to heavy loads and required 'excessive maintenance'. Train wheels rolling over the spikes loosened them, allowing the rail to break free and curve upwards far enough for a car wheel to get beneath it and force the end of the rail up through the floor of the car, writhing and twisting, endangering passengers. These broken rails became known as "snake heads".[14]

When wrought iron became available, wrought iron plates provided an even more durable surface. The rails had projecting lugs (or ears) with a hole to enable them to be fixed to the underlying wooden rail.[15][citation needed]

Iron plateways

Section of L-shaped plate rails
A long fish bellied rail supported over several chairs

An alternative was developed by John Curr of Sheffield, the manager of the Duke of Norfolk's colliery there. This used an L-shaped rail, so that the flange was on the rail rather than on the wheel. The arrangement was also used by Benjamin Outram of Butterley Ironworks and by William Jessop (who became a partner in the firm in 1790). These rails were used to transport goods for relatively short distances down to canals, though Curr's line ran between the Manor colliery and Sheffield town. The rails are referred to as plates, and the railway is sometimes called a plateway; the term "platelayer" also derives from this origin. In theory, the unflanged wheels could have been used on ordinary highways, but in practice this was probably rarely done, because the wagon wheels were so narrow that they would have dug into the road surface.

The system found wide adoption in Britain. Often the plates were mounted on stone blocks without sleepers, but this was liable to cause the rails to spread apart, increasing the gauge. Railways of this kind were widely used in south Wales, particularly to transport limestone down to the ironworks, and then to take the iron to a canal, sometimes several miles away, which took the products to market. The rails were at first made of cast iron, typically in lengths of 3 feet (0.91 m), spanning between stone blocks.[16]

The stone blocks had been assumed to be permanent, but experience quickly showed that they settled and gradually moved under traffic, creating chaotic track geometry and causing derailments. Another problem was that the running surface was liable to become obstructed by stones, displaced from the ballast. An alternative was to use an iron tie bar to keep the rails to the proper gauge, incorporating a shoe in which the rail was fixed.[16]

An example of this was the Penydarren or Merthyr tramway. This was used by Richard Trevithick to demonstrate a pioneer locomotive in 1804, using one of his high pressure steam engines, but the engine was so heavy that it broke many of the rails.[citation needed]

Early edge rails


Cast iron edge rails were used by Thomas Dadford Jr. when building the Beaufort and Blaenavon lines to the Monmouthshire canal in 1793. These were rectangular, 2.5 inches (64 mm) in width with a depth of 3 inches (76 mm) and 4 feet (1.2 m) in length, and required flanges on the wagon wheels. The same year, Benjamin Outram used edge rails on the Cromford Canal. T-shaped beams were used by William Jessop on the Loughborough-Nanpantan line in 1794, and his sons used I-shaped beams in 1813–15 on a railway from Grantham to Belvoir Castle. Samples of these rails are held in the Science Museum, London.[17]

A short-lived alternative was the fish-bellied profile, first used by Thomas Barnes (1765–1801) at Walker Colliery, near Newcastle, in 1798, which enabled rails to have a longer span between blocks. These T-section edge rails, three feet long and laid on transverse stone sleepers, were still made of cast iron.[18]

Butt and lap joints


The earliest rails had square butt joints, which were weak and difficult to keep in alignment. George Stephenson introduced lapped joints, which maintained their alignment quite well.[19][page needed]

Modern edge rails


The breakthrough came when John Birkinshaw of Bedlington Ironworks in Northumberland developed rolled wrought iron rails in 1820 in 15 feet (4.6 m) lengths, as used for the Stockton and Darlington Railway. This was strong enough to bear the weight of a locomotive and of a train of wagons (or carriages) pulled by it. This marks the beginning of the modern rail era. This system was instantly successful, although some false starts took place. Some early rails were made in a T cross section, but the lack of metal at the foot limited the bending strength of the rail, which has to act as a beam between supports.

As metal technologies improved, these wrought iron rails were made progressively somewhat longer, and with a heavier, and therefore stronger, cross-section. By providing more metal in the foot of the rail, a stronger beam was created, achieving much better strength and stiffness, and a section was created similar to the bullhead rail section still visible today. This was expensive, however, and the promoters of early railways struggled with decisions about the appropriate weight (and therefore strength, and cost) of their rails.

At first, the rail section was almost symmetrical top-to-bottom, and was described as a double-headed rail. The intention was to invert the rail after the top surface had become worn, but rails tend to develop chair gall, an attrition of the rail where it is supported in the chairs, and this would have made running on the former bottom surface impossibly noisy and irregular. It was better to provide the extra metal on the top surface and gain extra wear there without the need to invert the rail at its half life.

Many railways preferred a flat bottom rail section, where the rails could be laid directly on the sleepers, representing a marked cost saving. Indenting of the sleeper was a problem; where the traffic was heavy, it became necessary to provide a sole plate under the rails to spread the load on the sleeper, partly vitiating the cost saving. However, in main line situations, this form found almost universal adoption in North America and Australia, and in much of continental Europe. The United Kingdom persisted with bullhead rail in main line use, with widespread introduction of flat-bottom rail only starting in about 1947.

Steel rails


The first steel rails were made in 1857, when Robert Forester Mushet remelted scrap steel from an abortive Bessemer trial in crucibles at Ebbw Vale ironworks; the resulting rails were laid experimentally at Derby railway station on the Midland Railway in England. They proved far more durable than the iron rails they replaced and remained in use until 1873.[20][21] Henry Bessemer supplied 500 tons of steel blooms to the London and North Western Railway's rail mill at Crewe in 1860, and several other companies began producing steel rails in the following years.[22] The transition to steel rails was hastened by the introduction of open hearth steelmaking; William Siemens set up his Landore steelworks partly to supply rail to the Great Western Railway.[22] A boom in rail production followed, but a banking crisis in America slowed the rate at which railways were built there and reduced orders to British rail producers.[23] The British iron and steel industry went into a recession, which particularly affected the wrought iron sector. When demand for rails began to grow again, it was largely for steel rails, which were more durable than those of iron.[citation needed]

Associated features

NZR Half Kilometer peg, 70 lb/yd track and track fishplate. Weka Pass Railway

Sleepers


Timber sleepers, that is, transverse beams supporting the two rails that form the track, replaced the individual stone blocks formerly used. This system had the major advantage that maintenance adjustments to the track geometry did not disrupt the all-important track gauge: the alignment of the track could be adjusted by shifting the sleepers bodily, without loss of gauge. Softwood was widely used, but its life was limited if it was not treated with preservative, and some railways set up creosoting plants for the purpose. Creosote-treated hardwood is now widely used in North America and elsewhere.

By that time, relatively long (perhaps 20 ft or 6.1 m) wrought iron rails supported in chairs on timber cross-sleepers were in use—a track form recognisable today in older track.

Steel sleepers were tried as an alternative to timber; Acworth,[24] writing in 1889, describes the production of steel sleepers on the London & North Western Railway, and there is an illustration showing rolled channel sections (shallow upturned "U" shapes) with no shaped ends, and with three-part forged chairs riveted directly to them. However, steel sleepers seem not to have enjoyed widespread adoption until about 1995. Their dominant usage now is for life extension of existing track on secondary routes. They have a significant advantage on weak formations and in poor ballast conditions, as the bearing area is at a high level, immediately under the rail seat.

Rail fastenings


The early cast iron rails of the 18th century and before used integral fixings for nailing or bolting to the railroad ties. Strap rails, introduced in the late 18th century and made of cast and later rolled iron, were nailed to wooden supports via countersunk holes in the metal. The introduction of rolled rail profiles in the 1820s, such as the single-flanged and later double-flanged parallel T rails, required the use of chairs, keys to hold the rail, and bolts or spikes to fix the chair. The flat-bottomed rail invented by Robert L. Stevens in 1830 was initially spiked directly to wooden sleepers; later, tie plates were used to spread the load and to keep the rail in gauge by means of shoulders built into the plate. Outside North America, a wide variety of spring-based fastening systems were later introduced in combination with baseplates and flat-bottomed rail; these are now ubiquitous on main line high speed railways.

Ballast


Track was originally laid directly on the ground, but this quickly proved unsatisfactory and some form of ballast was essential to ensure good drainage, spread the load and retain the track in position. The natural ground is rarely strong enough to accept the loading from locomotives without excessive settlement, more so in wet conditions; a layer of ballast under sleepers reduces the bearing pressure on the ground, tends to keep them in place and resist displacement, and keeps the permanent way well-drained.
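The load-spreading argument can be made concrete with a rough calculation. The Python sketch below is illustrative only: the wheel load, the load-sharing fraction, and all contact areas are assumed values chosen for the example, not figures from this article.

```python
# Rough illustration of how the sleeper and ballast spread a wheel load.
# Every numeric value here is an assumption for illustration; real axle
# loads, sleeper sizes, and load-sharing fractions vary widely.

wheel_load_n = 100e3   # N, roughly a 10-tonne wheel load (assumed)
sleeper_share = 0.5    # fraction of the wheel load taken by one sleeper (assumed)

stages = [
    ("rail seat (baseplate)", 0.03),   # m^2, baseplate footprint (assumed)
    ("sleeper on ballast",    0.35),   # m^2, effective bearing per rail seat (assumed)
    ("ballast on formation",  0.90),   # m^2, after spreading through the ballast depth (assumed)
]

load_n = wheel_load_n * sleeper_share
for name, area_m2 in stages:
    # Pressure falls by more than an order of magnitude from seat to formation.
    print(f"{name}: {load_n / area_m2 / 1e3:.0f} kPa")
```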

Ballast in early days was usually a locally available mineral product, such as gravel or reject material from coal and iron mining. The Great North of Scotland Railway used rounded river gravel, which does not constrain movement as much as sharp-edged stone. In later years slag, a by-product of steel making, and ash from steam locomotives were used. Modern practice is to employ sharp-edged crushed stone within a narrow size range.

Gauges


Early track gauges


The early railways were almost exclusively local concerns involved with conveying minerals to some waterway; for them, the gauge of the track was adopted to suit the wagons intended to be used, typically in the range 4 ft (1,219 mm) to 4 ft 8 1/2 in (1,435 mm), and at first there was no idea of any need for conformity with the gauge of other lines. When the first public railways developed, George Stephenson's skilful innovation meant that his railways were dominant, and the 4 ft 8 1/2 in (1,435 mm) gauge he used was therefore the most widespread. As early notions of linking up different railway systems evolved, this gauge secured general adoption. It is more or less an accident of history that this gauge—which suited the wagons already in use at the colliery where George Stephenson had been an engine man—became the British standard gauge: it was exported to most of Europe and North America.

Reference is sometimes made to the "gauge" of ruts in stone roadways at ancient sites such as Pompeii, and these are often asserted to be about the same as Stephenson's gauge. Of course the ruts were made by the wheels of carts, and the carts were of a sensible size for horse-drawn carts prior to the industrial era, pretty much the same as the size of the pre-railway carts at the colliery where Stephenson worked: that is the only connection.

Broad gauge track


When Isambard Kingdom Brunel conceived the Great Western Railway (GWR), he sought an improved design for his railway track and accepted none of the previous received wisdom without challenge. The 4 ft 8 1/2 in (1,435 mm) gauge had been fine for small mineral trucks on a horse-drawn tramway, but he wanted something more stable for his high speed railway. The large diameter wheels used in stage coaches gave better ride quality over rough ground, and Brunel originally intended to have his passenger carriages carried in the same way—on large diameter wheels placed outside the bodies of the carriages. To achieve this he needed a wider track gauge and he settled on the famous 7 feet (2.1 m) broad gauge. (It was later eased to 7 ft 0 1/4 in or 2.140 m.) When the time came to build the passenger carriages, they were designed conventionally with smaller wheels under the bodies after all, but with a seven-foot track gauge the bodies could be much wider than on the standard gauge. His original intention to have the wheels outside the width of the bodies was abandoned.

Brunel also looked at novel track forms, and decided to use a continuously supported rail. Using longitudinal timbers under each rail, he achieved a smoother profile while not requiring such a strong rail section, and he used a shallow bridge rail for the purpose. The wider, flat foot also meant that the chair needed by the bullhead section could be dispensed with. The longitudinal timbers needed to be kept at the proper spacing to retain the gauge correctly, and Brunel achieved this by using timber transoms—transverse spacers—and iron tie-bars. The whole assembly was referred to as the baulk road—railwaymen usually call their track a road. Initially, Brunel had the track tied down to timber piles to prevent lateral movement and bounce, but he had overlooked the fact that the made ground, on which his track was supported between piles, would settle. The piles remained stable and the ground between them settled so that his track soon had an unpleasant undulation, and he had to have the piles severed, so that the track could settle more or less uniformly. A variant of the baulk road can still be seen today on many older under-bridges where no ballast was provided. The design varies considerably, but in many cases longitudinal timbers are supported directly on the cross-girders, with transoms and tiebars to retain the gauge, but of course with modern rails and base-plates or chairs. The longitudinal sleepers are somewhat similar to modern-day Ladder track.

The group of railways that had Brunel as their engineer were successful, and broad gauge track spread throughout the west of England, South Wales, and the West Midlands. But as the British railway network spread, the incompatibility of the two systems became a serious obstacle, since a wagon could not be sent from one system to the other without transshipping the goods by hand. A Gauge Commission was appointed to determine national policy. The broad gauge was technically superior, but conversion of the standard gauge routes to broad gauge would have meant reconstructing every tunnel, bridge and station platform, whereas universal adoption of the standard gauge only required the progressive conversion of the track itself. The broad gauge was doomed, and no further independent broad gauge lines could be built.

The existing broad gauge routes could continue, but as they had no development potential it was only a matter of time before they were eventually converted to standard. In the meantime, an extensive mileage of mixed gauge track was installed, where each line had three rails to accommodate trains of either gauge. There were some instances of mixed gauge trains being run, where wagons of each gauge were run in a single train. The legacy of the broad gauge can still be seen where there seems to be an unnecessarily wide space between station platforms.

Twentieth century and beyond


1900 to 1945

World's railways in 1908 as presented by The Harmsworth Atlas and Gazetteer

At the beginning of the twentieth century, the form of British track had converged on the use of wrought iron bullhead rails supported in cast iron chairs on timber sleepers, laid in some form of ballast. In North America, the standard was T-rails and tie plates fastened to timber crossties with cut spikes. Many railways were using very light rails and, as locomotive weights and speeds increased, these became inadequate. Consequently, on main lines, the rails in use were made progressively heavier (and stronger). Metallurgical processes improved and better rails, including some of steel, came into use. From a maintenance point of view, the rail joints were the source of most of the work, and as steel-making techniques improved it became possible to roll steel rails of increased length—reducing the number of joints per mile. The standard length became 30 ft (9,144 mm), then 45 ft (13,720 mm) and finally 60 ft (18,290 mm) rails became the norm. For main line use, the standard rail section became the 95BH section, weighing 95 lb/yd (47 kg/m). For secondary routes, a lighter 85BH, 85 lb/yd (42 kg/m), section was used.
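The imperial rail weights quoted throughout this article convert to metric with exact factors; the minimal Python sketch below (section names from the text, conversion factors exact) reproduces the quoted equivalents.

```python
# Convert rail weights from pounds per yard (lb/yd) to kilograms per metre (kg/m).
LB_TO_KG = 0.45359237  # kg per lb (exact definition)
YD_TO_M = 0.9144       # m per yd (exact definition)

def lb_yd_to_kg_m(lb_yd: float) -> float:
    """Rail weight in kg/m for a given weight in lb/yd."""
    return lb_yd * LB_TO_KG / YD_TO_M

for section, lb_yd in [("85BH", 85), ("95BH", 95), ("109", 109), ("113A", 113)]:
    print(f"{section}: {lb_yd} lb/yd = {lb_yd_to_kg_m(lb_yd):.1f} kg/m")
# 85BH: 42.2 kg/m, 95BH: 47.1 kg/m, 109: 54.1 kg/m, 113A: 56.1 kg/m,
# matching the rounded figures quoted in the text.
```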

Flat bottom rails were still seen as undesirable for British main line railway use, despite their successful use in North America, although some lightly operated British railways used them, generally spiked directly to the sleepers. Under heavy usage they indent the sleepers severely, and the incremental cost of a base-plate appeared, at this early date, to rule the flat bottom section out.

Timber sleepers were expensive and not durable, and the railways’ engineers had strong—and conflicting—views about the best wood species and the best preservative treatments. The railways moved towards standardisation on a softwood sleeper preserved by pressure injection of creosote, measuring 8 ft 6 in (2.59 m) long by 10 in (250 mm) by 5 in (130 mm). Chairs were secured to the sleepers by trenails (steel spikes driven through a timber sleeve) or three chair-screws on first class routes. The GWR alone among the main line railways kept to its own standard, the 00 rail at 97.5 lb/yd (48.4 kg/m), and with two fangbolts securing each chair to the sleeper, with the head of the bolt under the sleeper and a nut above the chair—more secure but much more difficult to adjust.

Some experiments were made before 1945 with reinforced concrete sleepers, in most cases with bullhead chairs mounted on them. This was in response to the very high price of the best (most durable) timber, but reinforced concrete sleepers were never successful in main line use. Concrete pots were also used in sidings; they are sometimes called twin-block sleepers, and consisted of two concrete blocks each mounted with a chair, and an angle iron connecting them and retaining the gauge.

Post-war developments


At the end of the Second World War in 1945, the British railways were worn out, having been patched up following war damage without the availability of much new material. The country was economically in a weak situation also, and for nearly a decade after the war, materials—especially steel and timber—were in very short supply. Labour too was seriously restricted in availability.

The railway companies became persuaded that the traditional bullhead forms of track needed revision, and after some experimentation a new flat bottom rail format was adopted. The British Standard sections were unsuitable and a new profile, a 109 lb/yd (54 kg/m) rail, was made the new standard. In 60 ft (18 m) lengths, laid on steel baseplates on softwood sleepers, it was to be the universal standard. The fastenings were to be of a resilient steel type, and for secondary routes a 98 lb/yd rail was adopted. Regional variations still persisted, and hardwood sleepers and Mills clip fastenings were favoured on the Eastern Region, for example.

The new designs were successful, but they introduced many challenges, especially as experienced track maintenance staff became acutely scarce, and poorly maintained flat bottom track seemed more difficult to keep in good order than poorly maintained bullhead track. The greater stiffness of flat-bottom rail was an advantage, but it tended to straighten out between the joints on curves; its rigidity led to high vertical impact forces at badly maintained joints, and this resulted in large numbers of fatigue fractures at the joints. Moreover, the elastic rail fastenings had little resistance to rail creep (the propensity of the rails to move gradually in the direction of traffic), and the workload of pulling back the rails to regulate the joints was surprisingly high.

Long welded rails


Much of the work of maintaining the track was at the joints, especially as the stiff rails became dipped, and the joint sleepers took a hammering. Pre-war experiments with long welded rail lengths were built upon, and in the years from 1960 long rail lengths were installed, at first on hardwood sleepers but soon on concrete sleepers. For example, the first long welded rail (almost 1 mi or 1.6 km) on the UK's East Coast Main Line was laid in 1957, just south of Carlton-on-Trent, resting on rubber pads to resist rail creep.[25] In this pioneering stage, some catastrophic mistakes in detailed design were made, but from about 1968 continuous welded rail became a reliable standard for universal installation on main and secondary routes. The form adopted used pre-stressed concrete sleepers and a 110A rail section—a slight improvement on the 109 rails previously used—the A was to distinguish it from the British Standard 110 lb/yd (55 kg/m) rail section, which was unsuitable. Rail fastenings eventually converged onto a proprietary spring clip made by the Pandrol company which was the exclusive form of fastening in Britain for about 30 years.

The welded track was to be laid on 6 to 12 inches (15 to 30 cm) of crushed stone ballast, although this was not always achieved, and the bearing capacity of the formation was not always taken into account, leading to some spectacular formation failures.

A further enhancement to the rail profile produced the 113A section, which was the universal standard until about 1998; detail improvements to the sleepers and ballast profile completed the picture and the general form of the track had stabilised. This format is now in place over 99% of the first-class main lines in Britain, although the CEN60 (60 kg/m) rail section was introduced in the UK during the 1990s. This has a wider rail foot and is taller than the 113A section so is incompatible with standard sleepers.

Track renewal trains have now replaced labour-intensive permanent way gangs. Long welded rail was hard to install manually. An early demonstration of mechanised track-laying with two 600 ft (180 m) lengths of long welded rail took place on the Fighting Cocks branch in 1958. The two lengths were loaded on ten wagons, attached to the existing track by a steel rope and drawn back at 30 ft/min (9.1 m/min). As the train moved back, the old rails were levered out and the new ones dropped into the chairs. A hoist on the rear wagon dropped the last part of the rail into place.[26]

Track gauge


As stated, the general track gauge in Britain was 4 ft 8 1/2 in (1,435 mm). In the later 1950s, general track maintenance standards deteriorated rapidly due to labour shortages and, on some routes, faster freight train speeds. Freight trains consisted almost entirely of short wheelbase (10 ft or 3.0 m) four-wheeled wagons carried on very stiff elliptical leaf spring suspension, and these wagons showed a rapid increase in derailments.[citation needed]

In response to the dynamic behaviour ("hunting") of the wagons, the permitted speed of the wagons was reduced to 45 mph (72 km/h) and, on new installations of continuously welded track on concrete sleepers, the track gauge was reduced by one-eighth of an inch to 4 ft 8 3/8 in (1,432 mm).[citation needed] In practice this change caused more problems than it cured, and from 1996 the gauge on renewals was returned to 1,435 mm. The gauge is set by the positioning of the cast-in fixings, so it is not a simple task to re-gauge existing track; the change also creates problems with spot replacement of sleepers. Many sleepers were made with the reduced track gauge, but 1,435 mm (4 ft 8 1/2 in) standard gauge versions have also been manufactured in more recent times.[27]

Switches and crossings

Main article: Railway turnouts

Terminology in this area is awkward: "switches and crossings" (S&C) were previously known as "points and crossings", or simply "fittings".

Early S&C allowed only a very slow speed on the subsidiary route (the "turnout"), so geometrical design was not too important. Many older S&C units had a loose joint at the heel so that the switch rail could swing closed against the stock rail or open away from it. When the switch rail was closed, a reasonable alignment was secured; when it was open, no wheel could run on it, so its alignment did not matter.

As speeds rose, this was no longer feasible, and the switch rails were instead fixed at the heel end, their flexibility enabling the toe end to open and close. Manufacture of the switch rails was a complex process, and that of the crossings even more so. Speeds on the subsidiary route were rarely higher than 20 mph (32 km/h) except in very special designs, and great ingenuity was employed to give a good ride to vehicles passing through at speed on the main line. A difficulty was the common crossing, where continuous support to passing wheels was hard to provide; the point rail was planed to protect it from direct impact in the facing direction, so that a designed irregularity in support was introduced.

As faster speeds were required, more configurations of S&C were designed, and a very large number of components, each specific to only one type of S&C, was required. At faster speeds on the turnout road, the divergence from the main route is much more gradual, and therefore a very considerable length of planing of the switch rail is required.

About 1971, this trend was reversed with the so-called vertical S&C, in which the rails were held vertical, rather than at the customary 1 in 20 inclination. With other simplifications, this considerably reduced the stockholding required for a wide range of S&C speeds, although the vertical rail imposes a loss of the steering effect and the ride through new vertical S&C is often irregular.

Continuous welded rail

Continuous welded track with conductor rail installed in the 1970s

Continuous Welded Rail (CWR) was developed in response to the observation that the bulk of track maintenance work takes place at the joints. As steel production and manufacturing processes improved, the rail lengths installed were progressively increased, and the logical extension of this would be to eliminate the joints altogether.

A major obstacle to doing so is thermal expansion: the rails expand in higher temperatures. Without joints, there is no room for the rails to expand; as the rails get warmer, they will develop an enormous force in trying to expand. If prevented from expanding, they develop a force of 1.7 tonnes (17 kN) for every 1 degree Celsius of temperature change in a practical rail section.[28]
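The quoted figure of about 17 kN per degree can be sanity-checked with the restrained thermal stress relation F = EAαΔT. In the Python sketch below, the Young's modulus, expansion coefficient and steel density are assumed textbook values, and the cross-section is approximated from a roughly 56 kg/m rail; none of these inputs come from the article itself.

```python
# Sanity check of the ~17 kN per degree Celsius figure for fully
# restrained rail: F = E * A * alpha * delta_T.

E_PA = 210e9        # Pa, Young's modulus of rail steel (assumed textbook value)
ALPHA = 11.5e-6     # 1/degC, linear expansion coefficient of steel (assumed)
RHO = 7850.0        # kg/m^3, density of steel (assumed)
MASS_PER_M = 56.0   # kg/m, roughly a modern British rail section (assumed)

area_m2 = MASS_PER_M / RHO            # cross-sectional area, ~0.0071 m^2
force_per_degc = E_PA * area_m2 * ALPHA

print(f"{force_per_degc / 1e3:.1f} kN per degC")  # ~17.2 kN, matching the text
```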

If a small cube of metal is compressed between the jaws of a press, it will contract—that is it will be squashed somewhat—and a very large force can be resisted by it without ultimate failure. However, if a long piece of metal of the same cross section is compressed, it will deform sideways into a bow shape; the process is called buckling, and the compressive force it can withstand is very much less.

If the long thin piece of metal could be constrained to prevent it from buckling (e.g. by being contained inside a tube) then it can resist a much higher compressive force. If the rails can be constrained in a similar way, they can be prevented from buckling. The weight of the track resists buckling upwards, so buckling is most likely to take place laterally. This is prevented by the following measures (a rough scale-of-forces sketch follows the list):

  • providing heavy sleepers, that generate friction on the ballast bed
  • ensuring that the sleepers are well supported on consolidated ballast to enable the generation of the friction
  • providing consolidated ballast around the sides of the sleepers to provide additional friction
  • heating the rails when they are installed and fastened in cool or cold weather, so that the expansion on the hottest days is less than otherwise
  • making sure that any extra rail added to repair a break during cold weather is removed before warm weather returns
  • making sure that curves do not line themselves inward during cold weather sufficiently to make buckling more likely when warm weather returns
  • taking precautions when track maintenance work is performed in hot weather, and making sure ballast is sufficiently consolidated before full-speed operation is resumed.
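To give a feel for why these precautions matter, the sketch below compares the axial force locked into the two rails on a hot day with the lateral restraint that consolidated ballast might offer over one buckle length. It is a crude scale comparison under stated assumptions, not a buckling criterion; every input except the 17 kN-per-degree figure quoted earlier is assumed.

```python
# Orders-of-magnitude comparison only; not a design or buckling calculation.

FORCE_PER_DEGC_N = 17e3    # N per degC per rail (figure quoted in the text)
DELTA_T_C = 25.0           # degC above the stress-free temperature (assumed)
SLEEPER_SPACING_M = 0.65   # m between sleeper centres (assumed)
LATERAL_RES_N = 8e3        # N lateral resistance per sleeper, good ballast (assumed)
BUCKLE_LENGTH_M = 12.0     # m, representative lateral buckle length (assumed)

axial_n = 2 * FORCE_PER_DEGC_N * DELTA_T_C                          # both rails
restraint_n = (BUCKLE_LENGTH_M / SLEEPER_SPACING_M) * LATERAL_RES_N

print(f"locked-in axial force: ~{axial_n / 1e3:.0f} kN")                 # ~850 kN
print(f"ballast restraint over one buckle length: ~{restraint_n / 1e3:.0f} kN")  # ~150 kN
```

The wide gap between the two numbers suggests why the stress-free temperature, ballast consolidation and the cold-weather precautions listed above all matter in practice.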

If the rail is held so that it cannot expand at all, then there is no limit on the length of rail that can be handled. (The expansive force in a one-foot length of rail at a certain temperature is the same as in a mile or 100 mile length of rail.) Early continuous welded rail was installed in limited lengths only because of technological limitations. However at the end of the CWR section where it abutted older, ordinary jointed track, that track would be unable to resist the expansive force and the jointed track might be forced to buckle. To prevent that, special expansion switches, sometimes called breathers, were installed. The expansion switches could accommodate a considerable expansive movement—typically 4 in (100 mm) or so—in the end section of the CWR without passing the movement on to the jointed track.

The CWR is installed and fastened down at an optimum temperature, to ensure that the highest possible expansive force is limited. This temperature is called the stress-free temperature, and in the UK it is 27 °C (81 °F).[28] It is in the upper range of ordinary outdoor temperatures, and the actual installation work tends to be done at cooler temperatures. Originally the rails were physically heated to the stress-free temperature with propane gas heaters; they were then rattled with hand bars to eliminate any binding that would prevent even expansion, and then clipped down. Since about 1963, however, hydraulic jacks have been used to stretch the rails while they are supported on temporary rollers. By stretching the rails to the length they would have at the stress-free temperature, the need to heat them is removed; they can simply be clipped down before the jacks are released.
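The stretch the jacks must apply follows from the ordinary expansion relation ΔL = αLΔT. A minimal sketch, assuming a textbook expansion coefficient for steel (the 27 °C stress-free temperature is from the text):

```python
# Extension needed so a rail clipped down cold behaves as if it had been
# installed at the stress-free temperature: delta_L = alpha * L * delta_T.

ALPHA = 11.5e-6        # 1/degC, expansion coefficient of rail steel (assumed)
STRESS_FREE_C = 27.0   # degC, UK stress-free temperature (from the text)

def required_stretch_mm(length_m: float, rail_temp_c: float) -> float:
    """Extension in millimetres to emulate the stress-free temperature."""
    return ALPHA * length_m * (STRESS_FREE_C - rail_temp_c) * 1000.0

# A 1 km rail being stressed on a 10 degC day needs roughly:
print(f"{required_stretch_mm(1000.0, 10.0):.0f} mm")  # ~196 mm
```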

The CWR rails are made by welding ordinary rails together. For many years, rails could only be made in lengths of up to 60 ft (18.288 m) in Britain, and the factory welding process made them into 600 ft (180 m), 900 ft (270 m), or 1,200 ft (370 m) lengths, depending on the factory. The process used was a flash-butt process, in which high electrical currents are used to soften the rail ends, which are then forced together by rams. The flash-butt process is very reliable, provided that the factory ensures good geometry of the rail ends.

The long rails could be conveyed to site by a special train, and unloaded on to the ground (by chaining the end in position and pulling the train out from underneath the rails). The long rails had to be welded together (or to adjacent track) using a site welding process; and, after initial experimentation, the proprietary Thermit welding process was used. This was an alumino-thermic process in which a powder 'portion' was ignited; the aluminium was the fuel and a metallurgically appropriate composition of molten steel descended into the gap between the rail ends, contained in refractory moulds.

The original SmW process was very sensitive to operator skill, and as welding was usually the final operation before returning the track to traffic, time pressure sometimes resulted in improper welds. The improved SkV process was less sensitive, and over the years weld quality improved.[29]

The issue of buckling is not restricted to CWR, and jointed track has suffered buckles in the past. The fish-plates at joints need to be removed and greased annually (the requirement was relaxed to once every two years in 1993), and where this was omitted, or where ballast conditions were especially weak, buckling took place in hot weather. In addition, if rails were allowed to creep, several successive joints might close up so that the expansion gap was lost, with inevitable results at the onset of hot weather.

from Grokipedia
The history of the railway track traces the technological progression from rudimentary wooden pathways used in mining operations during the 15th and 16th centuries to advanced rail systems supported by sleepers in the contemporary era, fundamentally enabling the global expansion of rail transport for freight and passengers. Early developments originated in European mines, where wooden rails facilitated the movement of wheeled tubs drawn by horses, with the first documented wagonways appearing in Germany around 1550 and spreading to Britain by the late 16th century through German miners. By the early 17th century, these wooden tracks were widespread in English coalfields for transporting coal to rivers and ports, often spanning several miles, as exemplified by the Causey Arch bridge constructed in 1725–1726, the oldest surviving railway structure.

Innovations in the late 18th century addressed durability issues: cast iron rails were introduced in 1767 by the Coalbrookdale Company in Shropshire, laid atop wooden beams to prevent wear, while flanged wheels, patented by William Jessop in 1789, improved stability by providing a groove for better rail grip. The advent of steam locomotion in the early 19th century accelerated track evolution, with wrought iron rails developed by around 1820 offering greater strength and longevity compared to cast iron, which was prone to brittleness. These advancements underpinned the opening of the Stockton and Darlington Railway in 1825, the world's first public steam-powered railway, and the Liverpool and Manchester Railway in 1830, which demonstrated intercity passenger viability. By the mid-19th century, the transition to steel rails began in 1857 when British metallurgist Robert Mushet produced the first durable steel rails by improving the Bessemer process through the addition of alloys containing manganese and carbon, revolutionizing track resilience and reducing maintenance needs amid expanding networks.

In the 20th century, further refinements included the introduction of concrete sleepers for enhanced load distribution and longevity over traditional wooden ties, with widespread adoption in the mid-20th century; prestressed mono-block designs emerged in 1943 to support high-speed and heavy-haul operations. Continuous welded rail, introduced in the 1930s, eliminated joints for smoother rides and reduced wear, while modern tracks incorporate advanced alloys, automated monitoring, and standardized gauges like the 1,435 mm (4 ft 8½ in) Stephenson gauge to ensure interoperability across global systems. These developments have sustained railways as a cornerstone of efficient land transport.

Ancient and Pre-Industrial Precursors

Rutways and Early Guided Paths

Rutways, consisting of parallel grooves carved into stone surfaces by the repeated passage of wheeled vehicles, represent some of the earliest known guided transport paths in ancient civilizations. These features emerged as precursors to modern railway tracks by constraining wheel paths and reducing lateral movement on uneven terrain, facilitating more efficient cart travel without the need for raised rails. Dating back to antiquity, rutways were typically formed unintentionally through wear but served to guide ox-drawn or horse-pulled carts along predefined routes.

In the Roman Empire, rutways appeared on stone-paved roads constructed between approximately 300 BCE and 400 CE, where deep grooves—often 5 to 10 inches in depth—were worn into the lava stone by iron-tyred wagon and cart wheels. These ruts, spaced to match standard Roman cart axle widths of about 1.4 meters, helped maintain straight-line travel on narrow, single-lane streets elevated for drainage. Archaeological evidence from sites across the empire, including repairs using molten iron to fill worn grooves, underscores their role in supporting heavy urban traffic. A prominent example comes from Pompeii, where excavations reveal over 600 instances of parallel wheel ruts etched into street pavements and crossing stones, dating from the first century BCE to the city's destruction in 79 CE. These grooves, formed by generations of vehicular use including plaustrum wagons and cisium chariots, preserved directional patterns of traffic and indicate a sophisticated urban system with high kerbs and pedestrian stepping stones to avoid the channels. The ruts not only guided wheels but also minimized friction on the durable stone surfaces, enabling smoother passage for loaded carts in a bustling port city.

Similar grooved stone paths appear in early Chinese contexts, such as the Jingxing Road in Hebei Province, built in the third century BCE and used continuously until the mid-20th century. This narrow artery features parallel grooves worn by ox-carts and carriages with standardized wheel spacings, which workers periodically flattened to maintain usability. The ruts guided transport across mountainous terrain, reducing slippage and wear on vehicle axles while supporting trade and military movement. In ancient India, rare examples of cart tracks are evident in the planned streets of Indus Valley Civilization sites like Harappa and Mohenjo-daro, dating to 2600–1900 BCE. These grooves in baked-brick or stone pavements accommodated solid-wheeled carts pulled by oxen or humped bulls, as depicted in contemporary terracotta models, and helped align vehicles within the grid-like urban layouts. Such features lowered rolling resistance on compacted surfaces, promoting efficient short-haul transport of goods such as grain.

Overall, these rutways across Roman, Chinese, and Indian civilizations demonstrated how natural or semi-intentional grooves could guide wheeled vehicles, decrease friction compared to unpaved earth, and enhance stability without mechanical guidance or elevated rails. By constraining wheel paths, they foreshadowed the development of more formalized wooden wagonways in medieval Europe.

Medieval Wagonways

The emergence of medieval wagonways in Europe, particularly in German mining regions, dates back to around the 1550s, when wooden rails were employed to facilitate the transport of ore from mines. These early systems, known as Reisen wagonways, consisted of simple parallel wooden rails laid on the ground, along which horse-drawn or pushed mine carts traveled. The carts featured flanged wheels that engaged the inner edges of the rails to maintain guidance and prevent derailment, a feature that marked a significant advancement over earlier unguided paths within mines.

The technology was introduced to England in the late 16th century by German miners, with an early underground example at the Mines Royal near Keswick (Caldbeck) around 1568, and adapted for overground coal extraction by the early 17th century. A prominent example is the Wollaton Wagonway, constructed between October 1603 and October 1604 by Huntingdon Beaumont in partnership with Sir Percival Willoughby near Nottingham. This approximately 2-mile (3.2 km) overland track connected coal pits at Strelley to a distribution point at Wollaton, utilizing wooden rails and horse-drawn wagons to haul coal efficiently to nearby roads and the River Trent for further transport by water. The system operated successfully for at least a decade, demonstrating the practicality of railed transport in industrial settings.

Compared to unpaved paths, these wooden wagonways offered a key advantage through reduced rolling resistance, enabling a single horse to pull substantially heavier loads—up to several tons on rails versus a fraction of a ton on soft roads—thus improving efficiency in mining operations. This efficiency contributed to their adoption in extractive industries across Europe.

Wooden Track Systems

Flat Wooden Rail Wagonways

Flat wooden rail wagonways represented an early form of guided track system in the burgeoning industrial landscape of 18th-century Britain, primarily developed to facilitate the efficient transport of coal from collieries to rivers or canals using horse-drawn wagons. These systems utilized flat wooden planks laid end-to-end to form parallel rails, providing a smoother surface than unpaved paths and reducing rolling resistance for wheeled vehicles. Originating in the coal-rich regions of northeast England, such as around Newcastle and Durham, they evolved from rudimentary 17th-century wagonways and proliferated during the 1700s as mining output surged, with similar flat-plank configurations appearing in European collieries, including in Germany, by the mid-18th century. The earliest known example is the Wollaton Wagonway of 1604 near Nottingham, spanning about 3 km for horse-drawn transport.

Construction typically involved sturdy timber planks, often oak or beech, measuring 6 to 8 feet in length and roughly 4 to 6 inches thick, placed flat and butted together to create continuous rails. These planks were secured by wooden pegs or spikes to transverse sleepers—crossbeams of wood laid at intervals of 2 to 3 feet—to elevate the track slightly above the ground and distribute weight. The wagons, featuring plain iron-tyred wheels that rolled directly on the flat surface, were guided by a wooden peg fitting into a groove between the parallel rails, allowing one horse to haul loads of up to 2 to 3 tons over distances of several miles. Rails were typically made from durable hardwoods such as oak or beech, sometimes sourced from recycled timber. A notable example is the 1767 plateway at Coalbrookdale in Shropshire, a hybrid system where wooden planks or sleepers supported early cast-iron plates, marking an incremental improvement in durability while retaining wooden elements for cost efficiency in the iron-producing district.

Despite their practicality, flat wooden rail wagonways suffered significant limitations, including rapid wear from the constant pressure of wagon wheels and horse hooves, which caused splintering and uneven surfaces over time. Wood rot accelerated deterioration in damp colliery environments, necessitating frequent replacements—sometimes every few years—and high maintenance costs that strained operators. Stress from overloaded wagons or adverse weather further exacerbated splitting and misalignment, limiting these systems to low-speed, short-haul operations and prompting eventual transitions to more robust edged wooden rails for enhanced guidance and longevity.

L-Section and Edged Wooden Rails

In the mid-18th century, particularly around the 1760s in Britain, L-shaped wooden rails emerged as an innovative profiled track system designed to guide wagons more effectively on early wagonways. These rails, constructed from timber such as oak or beech, featured a vertical flange on the inner side to guide plain-wheeled wagons by containing the wheel treads, preventing derailments and representing a transitional form from flat rail wagonways toward more specialized rail profiles. Primarily used in regions like Northumberland and Durham, they facilitated horse-drawn transport over short distances to rivers or staiths, improving efficiency in industrial operations.

Edged wooden rails, prevalent from the early 17th century but refined in the 18th century, consisted of timber rails with a raised upper edge for wheel contact and a stabilizing lower section or flange to anchor them to sleepers. This configuration allowed flanged wooden wheels to run with reduced friction and better alignment, enabling heavier loads—up to several tons per wagon—compared to unprofiled planks. Archaeological evidence from sites like the Willington Waggonway demonstrates their double-rail layout on transverse sleepers spaced 2-3 feet apart, underscoring their role in standardizing early guided paths for mineral haulage. Rails were typically made from durable hardwoods such as oak or beech, sometimes sourced from recycled timber.

A prominent application occurred in Richard Trevithick's 1805 experiments at Wylam Colliery, where a high-pressure steam locomotive operated on wooden-edged rails laid to a 5-foot gauge, testing adhesion and traction on timber tracks before transitioning to iron. Despite these advances, wooden rails suffered from rapid wear due to abrasion from flanged wheels, moisture-induced rot, and exposure to industrial pollutants, necessitating frequent replacements every 1-2 years to maintain safe operations. In the early 19th century, such rails were sometimes reinforced with iron straps nailed along the top to mitigate wear and extend usability.

Introduction of Iron Rails

Iron Plateways

Iron plateways represented an early innovation in railway track construction, where cast iron plates were affixed to wooden rails to provide a more durable running surface for horse-drawn wagons, primarily in industrial settings such as collieries. This development addressed the rapid wear of wooden tracks under heavy loads, extending their service life while maintaining the basic structure of pre-industrial wagonways. The plates allowed for smoother wheel movement and reduced rolling resistance, facilitating the transport of coal, ore, and other goods over short distances.

The initial production of plates for railways occurred in 1767 at the Coalbrookdale Iron Works in Shropshire, England, where plates measuring approximately 3 feet (0.91 m) in length and 4 inches (10 cm) in width, featuring projecting lugs for attachment, were cast and fixed to the tops of wooden rails using nails. These early plates were flat and marked a shift from purely wooden systems, though they still required wooden support. A significant advancement came around 1788 when John Curr, manager of the Duke of Norfolk's Sheffield Park Colliery, introduced L-shaped plates with an upright flange to guide plain wheels, designed specifically for underground mine use but soon adopted on surface lines; each plate was about 3 feet long and spiked or bolted to wooden stringers laid on the ground. By the late 18th century, iron plateways had become widespread in British collieries, enabling efficient haulage in operations across the North East and South Wales.

A notable example of a public iron plateway was the Surrey Iron Railway, authorized by Act of Parliament in 1801 and opened in 1803, extending 8.5 miles from Wandsworth to Croydon in Surrey (now part of Greater London). Engineered by William Jessop, this horse-drawn line used L-shaped plates, typically 4 to 6 feet (1.2 to 1.8 m) long, bolted to wooden rails supported by stone blocks; wagons carried loads of about 3 tons each, with a single horse pulling four to six wagons at speeds up to 4 mph. The design supported total loads exceeding 20 tons per train, significantly boosting goods transport for local industries such as milling.

Despite their advantages, iron plateways had inherent limitations due to the material properties of cast iron. The brittle nature of the metal caused plates to crack or shatter under the repeated impact of wagon wheels, especially on uneven sections or during overloads, leading to frequent derailments and accidents in colliery operations. These failures, often exacerbated by climatic variations and poor jointing, highlighted the need for stronger materials, paving the way for the adoption of edge rails suitable for emerging locomotive technologies.

Flanged and Edge Rails

The transition to flanged and edge rails in the early 19th century represented a pivotal shift in track design, particularly after 1810, as steam-powered locomotives demanded more robust track capable of withstanding higher loads and speeds. Unlike earlier plate rails that relied on flanges on the rail itself to guide unflanged wheels, edge rails featured a raised, narrow tread surface that shifted the guidance function to flanges on the wheels, improving stability and reducing wear on the track. This design, initially experimented with in Britain during the late 18th century, gained prominence post-1810 through the use of wrought iron, which provided greater tensile strength and resistance to brittle fracture, enabling longer rail sections and smoother operation under steam traction.

A landmark innovation came in 1820 when John Birkinshaw, manager of the Bedlington Ironworks in Northumberland, England, patented a process for rolling wrought-iron edge rails, marking the first practical method to produce them in extended lengths. These rails were typically manufactured in 15-foot lengths—significantly longer than the 3- to 4-foot cast-iron predecessors—allowing for fewer joints and a more continuous track surface. The cross-sectional profile resembled an early T-shape, with a widened head for wheel contact, a vertical web for structural support, and a broad base to distribute load onto the underlying sleepers or blocks, often incorporating a fish-bellied curve along the length to enhance rigidity. Weighing around 28 pounds per yard, this design used less material than equivalent cast-iron rails while offering comparable or superior strength due to wrought iron's malleability.

Birkinshaw's rails were prominently adopted on the Stockton and Darlington Railway, engineered by George Stephenson and opened in 1825 as the first public railway to rely primarily on steam locomotives for both passenger and freight service. Approximately two-thirds of the line's 26-mile track utilized these 15-foot wrought-iron edge rails, supplied by Bedlington Ironworks, totaling over 900 tons by 1823. This implementation demonstrated the rails' practical advantages: their malleability allowed them to work-harden under repeated wheel pressure, resisting rust, cracking, and deformation better than cast iron, while costing roughly twice as much initially but requiring minimal maintenance over years of service. The adoption of flanged wheels on edge rails facilitated steam traction speeds reaching up to 15 miles per hour on the Stockton and Darlington line during its inaugural runs, a marked improvement over horse-drawn systems, without the frequent track breaks that plagued cast-iron alternatives under locomotive forces. This durability supported the railway's operational reliability, carrying coal and passengers at averages of 8 miles per hour while peaking at 15 miles per hour on descents, establishing edge rails as the standard for emerging steam railways. Jointing challenges with these longer sections were later mitigated through specialized techniques.

Early Jointing Techniques

In the early 19th century, connecting segments of iron rails posed significant challenges for maintaining track alignment and smoothness, as rails were typically 12 to 15 feet long due to manufacturing limitations. The most basic method was the butt joint, where rail ends were squared off and placed end-to-end, relying on external supports to prevent misalignment and excessive wear from passing wheels. These joints were initially held by specialized cast-iron chairs positioned at the ends, which cradled both rail segments and were spiked or wedged into stone blocks or timber sleepers to secure the connection.

To address the abruptness of butt joints, which caused jolts and accelerated deterioration, lap joints emerged as an improvement around 1816. Patented by William Losh and George Stephenson (Patent No. 4067), half-lap joints involved overlapping the rail ends by a short distance, typically 6 to 9 inches, creating a staggered connection that provided a smoother transition for wheels and better resistance to vertical movement. These were secured similarly with chairs at the overlap, often redesigned with curved bases to accommodate the joint's profile, and were employed on early lines like the Stockton and Darlington Railway in 1825. Despite these advances, both butt and lap joints suffered from early failures, primarily due to thermal expansion and contraction; temperature fluctuations caused rails to buckle or loosen in their fastenings, while vibrations from traffic exacerbated splitting in stone blocks or swelling in wooden wedges.

A notable example of implementation occurred on the Liverpool and Manchester Railway, opened in 1830, where 15-foot fish-bellied wrought-iron rails weighing 35 pounds per yard were butted together and supported by joint chairs—each rail resting in seven such chairs on stone blocks or oak sleepers. These chairs, spiked into place, aimed to distribute load at the joints but proved inadequate for heavier traffic, leading to replacements by 1832. By the late 1840s, fishplates—flat iron bars bolted across the rail webs on either side of the joint—began supplementing or replacing chairs for butt joints, offering greater flexibility and reducing maintenance needs, though widespread adoption followed into the 1850s. In the transition to steel rails later in the century, these techniques evolved toward longer sections and more robust supports, minimizing joint-related issues.

Evolution to Steel Rails

Initial Steel Rail Production

The transition to steel rails marked a significant advancement in railway track durability during the mid-19th century, as iron rails frequently wore out or fractured under heavy loads, leading to frequent replacements and safety concerns. In 1857, British metallurgist Robert Mushet produced the world's first durable steel rails at his Dark Hill Iron Works in the Forest of Dean, Gloucestershire, using an enhanced version of the Bessemer process to convert pig iron into high-quality steel suitable for rail applications. The Bessemer process, invented by Henry Bessemer in 1856, involved blowing air through molten pig iron to oxidize and remove carbon and impurities, but Mushet's key innovation—adding spiegeleisen, an alloy of iron, manganese, and carbon—restored essential properties such as hardness and toughness that were lost during conversion, enabling the production of rails with superior wear resistance. Adoption of steel rails spread to the United States in the 1860s amid growing rail traffic during post-Civil War expansion, with the Pennsylvania Railroad leading early implementation. The first experimental Bessemer steel rails in the United States were rolled in 1865 by the North Chicago Rolling Mill; the Pennsylvania Railroad installed imported steel rails experimentally between Altoona and Pittsburgh in 1864 and later used American-produced rails from the Pennsylvania Steel Company starting in 1867. Early tests showed steel rails could last several times longer than iron rails, with estimates ranging from 3 to 10 times depending on traffic and conditions, significantly lowering maintenance costs. Initial steel rails measured about 30 feet in length and weighed 50 to 60 pounds per yard, balancing manufacturability with structural integrity for the era's locomotives. The economic viability of steel rails improved rapidly as production scaled, with prices dropping from over $100 per gross ton in the late 1860s to around $50 per ton by 1875, driven by technological refinements and increased output from Bessemer converters. This cost reduction, combined with steel's higher tensile strength—up to 100,000 pounds per square inch compared with wrought iron's roughly 50,000 psi—minimized rail fractures and derailments, enhancing track safety and reliability on high-traffic lines. These material advantages paved the way for the subsequent standardization of rail profiles as production volumes grew.

Standardization of Rail Profiles

The standardization of railway rail profiles began in the late 19th century as steel production advanced, enabling more uniform cross-sectional shapes to support increasing loads and speeds while minimizing wear and improving stability. One early influential design was the Vignoles rail, a flat-bottomed profile introduced by British engineer Charles Vignoles in 1836 for the London and Croydon Railway, featuring a broad base for direct attachment to ties and a slight upward incline on the bottom edges to enhance load distribution. This profile, originally developed for iron rails, was adapted for steel in subsequent decades, becoming the basis for the modern flat-bottom rails widely used outside the United Kingdom thanks to their simplicity in manufacturing and fastening. In parallel, two primary rail profiles emerged: the bullhead rail, characterized by a symmetrical head and foot of similar size held in cast-iron chairs, and the flat-bottom (or Vignoles) rail, with an asymmetrical T-shape whose foot flares out broadly for spiking directly to sleepers. The bullhead design predominated in the UK, originally for its ability to be inverted for even wear, while the flat-bottom profile gained favor in the United States and elsewhere for easier installation on wooden ties. In the US, the American Society of Civil Engineers (ASCE) established initial standardization of T-rail sections in 1893, specifying profiles in 5-pound-per-yard increments from 40 to 100 pounds per yard, with a weight distribution of approximately 42% in the head, 21% in the web, and 37% in the foot to optimize strength and wheel contact. This was refined by the American Railway Association (ARA, a predecessor of AREMA) in 1909, which defined standard sections in 10-pound increments from 60 to 120 pounds per yard, facilitating interchangeability across railroads. Rail lengths also standardized during this period to reduce joints and maintenance; by 1900, 60-foot sections had become common on major US lines, compared with the 30-foot rails of the 1850s. Weights increased to accommodate heavier locomotives, reaching up to 100 pounds per yard by 1900 on major US routes such as the New York Central, providing greater durability under loads exceeding previous iron rail capacities. In the UK, the Engineering Standards Committee introduced the first standard bullhead sections in 1904 under what evolved into British Standard (BS) specifications, such as BS 9 and later BS 11, which specified head widths of around 2.5 to 3 inches to ensure reliable wheel flange contact and lateral guidance. These national variations in profile were integrated with evolving fastening systems, such as spikes for flat-bottom rails and chairs for bullhead rails, to secure rails effectively to sleepers.
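As a quick illustration of the ASCE weight-distribution figures quoted above, the short sketch below applies the head/web/foot percentages to a nominal section weight; the 100 lb/yd input and the helper function are illustrative only, not part of any standard.

```python
# Sketch: applying the ASCE 1893 head/web/foot metal distribution
# (42% / 21% / 37%, as quoted above) to a nominal rail weight.
ASCE_SPLIT = {"head": 0.42, "web": 0.21, "foot": 0.37}

def section_weights(lb_per_yd: float) -> dict:
    """Approximate metal distribution, in lb/yd, among head, web, and foot."""
    return {part: round(lb_per_yd * frac, 1) for part, frac in ASCE_SPLIT.items()}

print(section_weights(100))  # {'head': 42.0, 'web': 21.0, 'foot': 37.0}
```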

Supporting Track Elements

Sleepers and Cross-Ties

Sleepers, also known as cross-ties in North America, serve as the foundational supports beneath railway rails, distributing loads and maintaining alignment. In the 18th century, early British waggonways employed wooden sleepers to underpin wooden rails, as in systems like the Bedlam Furnace tramway in Shropshire during the 1750s, where oak sleepers pegged oak rails in place. These transverse timber blocks, typically spaced 2 to 3 feet apart center-to-center, provided basic stability on rudimentary tracks, often laid with minimal ballast for drainage and firmness. By the 1820s, stone blocks had emerged as an alternative in Britain, offering durability on permanent ways; the Stockton & Darlington Railway, opened in 1825, used stone sleepers at locations such as the Brusselton incline to support cast-iron rails without obstructing the horse paths between the tracks. This shift addressed wood's susceptibility to rot in exposed conditions, though stone's rigidity limited flexibility under load. In the United States, the first transcontinental railroad, completed in 1869, relied on untreated ties for much of its construction, with the Central Pacific specifying 8-foot-long oak or equivalent hardwoods hewn to 6 by 6 inches, sourced from redwoods and Sierra Nevada mills to span the demanding Sierra Nevada terrain. The mid-19th century marked a transition to treated wooden sleepers, driven by preservation needs amid expanding networks; the Bethell process, patented in 1838, enabled pressure impregnation with coal-tar creosote, which gained traction over the following decades for enhancing longevity against decay and insects in railway applications. This treatment extended service life significantly, becoming standard for oak and other hardwoods by the late 1800s. Sleeper spacing evolved concurrently, narrowing from the initial 2-3 feet to optimize load distribution as rail weights increased, reaching a typical 20-25 inches center-to-center on many main lines by the early 20th century to reduce rail bending stresses. Concrete sleepers represented a pivotal innovation, with French inventor Joseph Monier patenting reinforced concrete designs in 1877 to replace vulnerable wood under heavy traffic. Though initial adoption was slow owing to manufacturing challenges, concrete sleepers proliferated after 1900, offering superior resistance to wear and eliminating treatment needs; by the early 20th century they appeared in experimental installations in Europe and Japan, transitioning to widespread use, including on high-speed lines, after World War II for their uniformity and longevity.

Ballast and Foundation Development

Early railway tracks built before the 1840s were typically constructed without ballast, relying on unballasted earth formations or massive stone blocks laid directly beneath the rails to provide rigidity and support. George Stephenson's early lines, for instance, used stone blocks fastened to the ground, but such systems often failed rapidly owing to instability and poor drainage under load. The introduction of ballast in Britain during the 1840s marked a significant advancement, initially utilizing readily available materials such as gravel and stone discharged from ships' holds for early tramways and railways. This provided a more stable and drainable foundation than bare earth, reducing track settlement and improving load distribution. By the 1870s, the shift to stone ballast, including crushed granite and other hard angular stones, further enhanced drainage and lateral stability, as these materials interlocked effectively to resist movement. Crushed granite, prized for its durability, became a preferred choice in regions with suitable quarries, contributing to longer track life and reduced maintenance needs. By 1900, depth standards had evolved to typically 12-18 inches (300-450 mm) to ensure adequate support and drainage, with specifications emphasizing angular stone sizes of 20-50 mm for optimal packing. These depths allowed better load transfer to the subgrade while facilitating water runoff to prevent weakening. In the post-1950s era, the adoption of tamping machines revolutionized maintenance, compacting ballast far more efficiently than manual methods, though this required adjustments to stone size for effective operation. Challenges with ballasted track persisted in dry regions, where dust generated by traffic and tamping contaminated the ballast bed, reducing its effectiveness and necessitating frequent cleaning. This issue prompted the exploration of alternatives in the twentieth century, particularly from mid-century onward, as ballastless systems offered superior geometry control and longevity in arid environments. Ballast also contributes to gauge maintenance by providing lateral resistance that prevents rail spreading.

Rail Fastening Systems

The earliest methods of securing rails to sleepers emerged in the early 19th century, with iron spikes becoming a primary fastening mechanism by the 1830s. Robert Livingston Stevens, president of the Camden & Amboy Railroad, invented the railroad spike in 1832, enabling flanged T-rails to be nailed directly to wooden sleepers and eliminating the need for intermediate chairs in many designs. These hand-forged iron spikes, typically with an offset head for better grip, were essential to the rapid expansion of American railroads, and their production was soon mechanized to meet demand. For bullhead rails, which featured symmetrical heads and feet for reversibility, cast-iron chairs were mounted on sleepers and the rails secured within these chairs by wooden keys or wedges. These keys, often made from oak and compressed for durability, wedged the rail tightly to prevent movement while allowing some adjustment during installation. This system, prevalent on British railways from the mid-19th century, provided stability for heavier traffic but required periodic replacement of the wooden components as they wore. The introduction of spring clips in the 1870s marked a shift toward more resilient fastenings, using elastic metal elements to maintain tension without rigid wedging. These early clips improved rail alignment and reduced loosening from vibration compared with spikes or keys. By the mid-20th century, elastic systems had advanced further; the Pandrol PR clip, patented in 1957 by Norwegian engineer Per Pande-Rolfsen, exemplified resilient fastening by applying consistent downward and lateral force through a spring-steel design. Elastic fastenings proliferated in the 1950s, incorporating rubber pads and clips to isolate vibrations, a progression that enhanced track longevity and passenger comfort, particularly on high-speed lines. Modern standards typically require fastenings at each rail seat to secure the base while allowing slight lateral play to accommodate thermal expansion and contraction without buckling. These systems have also been adapted for switches, where clips ensure secure attachment amid frequent directional changes.

Track Gauge Variations

Origins of Early Gauges

The origins of early railway track gauges trace back to the 18th-century wagonways used at English collieries, where wooden rails guided horse-drawn carts carrying coal. These primitive systems typically employed track widths between 3 and 5 feet, determined by the dimensions of existing mine wagons to ensure compatibility and minimize cost. George Stephenson played a pivotal role in formalizing one such gauge for steam-powered railways. In 1825, for the Stockton and Darlington Railway—the world's first public steam railway—he adopted a gauge of 4 feet 8 inches, derived directly from the Killingworth colliery wagonways where he had worked. This measurement accommodated the region's existing horse-drawn wagons, easing the transition to locomotive haulage. By 1829, Stephenson had refined it to 4 feet 8.5 inches for the Liverpool and Manchester Railway, adding the half-inch to improve wheel clearance on curves and reduce wear, establishing what became known as the standard gauge. Despite this emerging standard, regional practices led to significant variations in the early 19th century. In Britain, by 1840, over 100 different lines incorporated a wide array of gauges, ranging from 4 feet (as on the Redruth and Chasewater Railway of 1825) to 5 feet 6 inches (the Dundee and Arbroath Railway of 1838), reflecting local preferences and the absence of national regulation. These discrepancies arose from the decentralized nature of railway development, in which proprietors prioritized compatibility with pre-existing wagonways and rolling stock over uniformity. Across the Atlantic, early American railways mirrored this diversity, influenced by British exports but adapted to local needs. The first U.S. lines, such as the Baltimore and Ohio Railroad begun in 1828, initially used 4 feet 8½ inches (1,435 mm), while by the 1840s Southern states such as Georgia and South Carolina had standardized on 5 feet for their cotton-hauling networks, as it better suited heavier loads and broader wagons. Factors such as the varying dimensions of imported and domestic rolling stock, along with regional economic standards, perpetuated these choices, resulting in over 20 distinct gauges by mid-century and complicating interstate connectivity.

Broad Gauge Adoption and Challenges

In 1835, Isambard Kingdom Brunel, as chief engineer of the Great Western Railway (GWR), selected a broad gauge of 7 ft (2,134 mm)—later refined to 7 ft ¼ in (2,140 mm)—for the line connecting London to Bristol, aiming to enhance stability and enable higher speeds than the prevailing narrower gauges allowed. This choice reflected Brunel's vision of a system optimized for emerging locomotive technology, permitting smoother rides and reduced oscillation at speeds exceeding 50 mph. The broad gauge offered several advantages, including the capacity for wider, more stable carriages that reduced overturning risks and supported greater freight and passenger loads per train. These benefits influenced adoptions elsewhere: Russia standardized on a 5 ft (1,524 mm) gauge in the 1840s to gain similar stability for its expansive network, while Spain and Portugal established the Iberian broad gauge of approximately 5 ft 6 in (1,668 mm) in the mid-19th century, based on local measurements such as the Castilian foot, to facilitate larger rolling stock suited to mountainous terrain. Despite these merits, broad gauge systems faced severe challenges, most notably the "break of gauge" at network boundaries, where differing track widths necessitated unloading and reloading freight or transferring passengers, producing inefficiencies, delays, and elevated operating costs. A prominent example occurred in 1844 at Gloucester station, where the GWR's broad gauge met the standard-gauge Birmingham and Gloucester Railway, creating bottlenecks that highlighted the logistical chaos of mixed-gauge operation across Britain. Such issues prompted parliamentary intervention, including the Gauge Commission of 1845 and the resulting Gauge Act of 1846, which mandated the 4 ft 8½ in (1,435 mm) gauge for most new lines. The decline of broad gauge accelerated as economic pressures favored uniformity; the GWR's full conversion to standard gauge, completed over the weekend of 20–23 May 1892, involved relaying hundreds of miles of track at a cost of millions of pounds, marking the end of Brunel's system in Britain. By 1900, the 4 ft 8½ in standard had become the norm for most major rail networks worldwide, driven by interoperability needs and the dominance of British engineering exports, though exceptions persisted in India (5 ft 6 in) and in Spain and Portugal (Iberian gauge) owing to entrenched infrastructure.

Modern Track Innovations

Switches, Crossings, and Turnouts

The development of switches, crossings, and turnouts began in the earliest days of railways with rudimentary mechanisms suited to nascent systems. In the 18th century, early wooden railways employed simple single-bladed wooden switches to divert wagons between tracks, often operated manually by levers or bars placed directly alongside the rails. These basic designs, common on plateways and tramroads such as those of England's coal-mining regions, facilitated the movement of horse-drawn loads but were prone to misalignment and wear because of their timber construction. By the 1840s, as steam locomotives increased speeds and loads, cast-iron crossings—known as frogs—emerged to handle intersecting tracks more reliably than wooden alternatives. These cast-iron components formed the V-shaped intersection where wheels transition between rails, providing greater durability for emerging mainline networks, though their brittleness limited use under high-stress conditions. Distinctions in switch orientation became standardized with the concepts of facing and trailing points: facing points allowed trains to diverge onto branching routes, while trailing points enabled convergence onto main routes. This terminology, rooted in 19th-century British railway practice, emphasized safety differences: facing movements posed higher risks if misaligned, leading to mandatory locking mechanisms by the 1890s. By the late 19th century, double-slip designs integrated both movements in a compact crossing configuration, ideal for dense urban or yard networks needing bidirectional diverging and converging without excessive space. A pivotal safety advancement arrived in the 1870s with the Saxby & Farmer interlocking system, which mechanically linked switch and signal levers to prevent conflicting routes. Invented by John Saxby in 1856 and commercialized through his partnership with John Stinson Farmer from the 1860s, the system used a frame of interconnected rods and latches to ensure points could not be thrown while signals indicated clear, dramatically reducing collision risks at complex junctions. By the late 1870s, Saxby & Farmer machines dominated British installations and were exported worldwide, becoming a cornerstone of railway signalling technology. Post-1900 innovations focused on material enhancements for longevity amid heavier traffic. Manganese steel frogs, prized for work-hardening under impact, were first cast for street railways in the late 1890s and extended to mainline turnouts around 1900, when the first rail-bound manganese frog entered terminal service. This evolution addressed wear at crossings, where conventional cast frogs failed rapidly; by 1905, refined designs from Ramapo Iron Works featured integrated rail sections, and by the 1920s American Railway Engineering Association standards had established manganese frogs as the norm for high-wear applications. Adaptations for continuous welded rail later incorporated flexible joints in these components to accommodate thermal expansion.

Continuous Welded Rail Implementation

The implementation of continuous welded rail (CWR) began in the early 20th century as a response to the limitations of jointed rails, which caused vibrations, noise, and frequent maintenance problems from wear at the connections. Early experiments focused on welding short sections together to create jointless segments, eliminating the "clickety-clack" sound and bumps associated with traditional 39-foot rails. In the United States, the first short-welded rails, typically 100-200 feet in length, were installed in the 1930s, with the Delaware & Hudson Railway laying an experimental quarter-mile section in 1937, dubbed the "Velvet Track" for its smooth ride. Similar short-welded installations appeared in Britain during the same period, though widespread adoption lagged until after World War II, when full CWR became standard on main lines in both countries. Key to CWR's development were advances in welding technique, particularly flash butt welding, which emerged in the early 20th century and produced stronger, more consistent rail joints. This method, which uses electrical resistance to heat and upset the rail ends, displaced earlier gas-pressure and thermite processes for plant welding, enabling the fusion of longer rail segments with minimal defects. By reducing the number of joints by roughly 90% compared with jointed track—where dozens of connections per mile required regular tightening and replacement—flash butt welding significantly streamlined track construction and upkeep. After World War II these methods enabled the production of factory-welded rail strings up to 1,440 feet long, transported on specialized rail trains for field installation. The primary benefits of CWR include a smoother ride for passengers and freight, as the absence of joints minimizes vertical oscillations and noise, allowing trains to operate at higher speeds with greater stability. Maintenance costs are also lower, with reduced wear on rail ends, wheels, and fasteners, leading to longer track life and fewer disruptions; joint-related failures, which accounted for a substantial share of pre-CWR repairs, were largely eliminated. By the 1960s, CWR strings of a quarter mile (about 1,320 feet) or more were common on main lines in the United States and Britain, further amplifying these advantages through even fewer field welds. Managing thermal stress in CWR is critical to prevent buckling in heat and excessive tension in cold, since the continuous structure lacks the expansion gaps of jointed rail. Rails are pre-stressed during installation by heating or tensioning them to a neutral temperature—typically 90-110°F (32-43°C)—at which point they are anchored to the sleepers with clips or other fastenings, inducing tension in cooler weather to counteract summer expansion. This neutral range, set toward the upper end of the annual ambient extremes (often 0-120°F), keeps the rail stable; guidelines recommend monitoring and adjusting the neutral temperature to maintain this balance and avoid derailments from track distortion.
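The thermal-stress behavior described above follows directly from the standard restrained-bar relation, force = E · α · A · ΔT. Below is a minimal sketch assuming textbook values for rail steel and the cross-section of a heavy modern rail; none of these figures come from this article.

```python
# Sketch: longitudinal force in fully restrained CWR (assumed typical values).
E = 30e6         # Young's modulus of rail steel, psi
ALPHA = 6.5e-6   # thermal expansion coefficient of steel, per deg F
AREA = 13.3      # cross-sectional area of a ~136 lb/yd rail, sq in (approx.)

def thermal_force_lbf(neutral_f: float, rail_f: float) -> float:
    """Force in lbf; positive values indicate compression (buckling risk)."""
    return E * ALPHA * AREA * (rail_f - neutral_f)

# Rail anchored at a 100 F neutral temperature, heated to 130 F in summer:
print(f"{thermal_force_lbf(100, 130):,.0f} lbf compression")  # ~77,800 lbf
```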

Twentieth-Century Advancements (1900-1945)

The early 20th century saw the adoption of heavier rails, often exceeding 100 pounds per yard, to support the increased loads and speeds associated with electrification projects. These rails were essential for handling the higher tractive effort of electric locomotives, which demanded greater structural integrity to prevent deformation under heavy loads. In the United States, for instance, electrified lines incorporated 105-pound rails by the 1910s to accommodate multiple-unit electric operations. Concrete sleepers emerged as a durable alternative to wooden ties in Germany during the interwar years, driven by timber shortages and the need for longevity on electrified and high-traffic lines. Initial experiments dated back to 1906 on the Nuremberg-Bamberg line, but by the mid-1920s the Deutsche Reichsbahn was installing concrete sleepers on a larger scale, particularly in the Ruhr region, where they proved resistant to industrial pollution and rot. These early designs featured post-tensioned monoblocks, laying the groundwork for broader European adoption amid interwar infrastructure upgrades. In the United States, the 1930s marked the introduction of mechanical ballast tamping machines, revolutionizing track maintenance by automating the compaction of ballast under sleepers for improved stability. Developed by companies such as the Jackson Manufacturing Company, these early machines used vibrating tines to pack ballast efficiently, reducing manual labor and enabling faster alignment on main lines. By 1938, the Association of American Railroads was testing prototypes that could tamp up to 1,000 feet of track per hour, addressing wear from the era's heavier freight traffic. Wartime demands during World Wars I and II accelerated the use of prefabricated track sections for rapid repairs, allowing quick restoration of damaged lines under combat conditions. French engineer Paul Decauville's system, featuring portable 2-foot-gauge panels with steel sleepers, was widely employed by Allied forces in 1914-1918 to build supply routes behind the front lines, with sections assembled by small teams in hours. Similar prefabricated methods persisted into World War II, when U.S. Military Railway Service units deployed modular track panels in Europe and the Pacific to bypass sabotage and bombing, supporting over 22,000 miles of operational rail. After World War I, several European countries undertook gauge conversions to standardize networks fragmented by the conflict, improving interoperability for reconstruction. Approximately 3,000 kilometers of Russian broad gauge (1,524 mm) captured during the war had been converted to standard gauge (1,435 mm) by 1925, primarily in eastern territories integrated with the Reichsbahn system. Newly independent states such as Lithuania likewise shifted from Russian to standard gauge in the early 1920s, though reversions occurred later; these changes involved lifting and re-spacing rails, often using temporary dual-gauge plates to minimize downtime. World War II imposed unprecedented stress on railway tracks across Europe and Asia, with intensified military traffic causing accelerated wear, including rail-head battering and sleeper degradation at rates up to five times peacetime levels. In response, belligerents developed standardized maintenance protocols, such as the U.S. Army's emphasis on daily inspections and emergency repair kits, to sustain operations; these protocols, documented in military field manuals, prioritized rapid fault detection to keep supply lines running amid the Allied advances. The era's innovations, including early welding techniques, carried over into post-war track rehabilitation efforts.

Post-War and Contemporary Developments

Long Welded Rails and Expansion Joints

Following World War II, advances in welding technique enabled the development of long welded rails (LWR), extending rail lengths to roughly 1/4 to 1/2 mile (400-800 meters) to minimize joints and improve track smoothness. These rails addressed the limitations of the shorter welded panels of the pre-war era by reducing maintenance needs and improving load distribution, with initial implementations concentrated on ballasted track. Expansion joints, such as breather switches or switch expansion joints (SEJs), were placed at the ends of LWR sections to accommodate thermal movement, typically with an initial gap of around 40 mm at the destressing temperature to prevent excessive stress build-up. Stress management during LWR installation centered on destressing procedures to mitigate the risks of thermal expansion and contraction. During laying, rails were heated or tensioned to a neutral temperature (often 35-45°C in tropical regions), eliminating installation stresses and allowing controlled expansion of up to about 1/2 inch per 100 feet for temperature rises of 40-50°C, depending on rail properties. This process, involving rail cutting, gap adjustment, and rewelding, kept longitudinal forces below critical thresholds, with lateral resistance from ballast and fastenings providing stability against buckling modes such as sun kinks during heatwaves. Adoption of LWR accelerated in the 1950s and 1960s as post-war reconstruction prioritized efficient infrastructure. In India, experimental short welded panels evolved into LWR by the late 1960s following a 1966 Railway Board recommendation, with widespread use on broad-gauge lines to support growing traffic. In the United States, LWR became standard on main lines in the 1960s and 1970s, building on the continuous-welded-rail trials of the 1930s and yielding smoother rides and reduced wheel-rail noise and vibration compared with jointed track. These benefits included roughly halved maintenance intervals and extended rail life, though challenges such as sun kinks (lateral distortions from uneven heating) emerged in extreme conditions, prompting the 1970s introduction of strain-gauge monitoring and neutral-temperature surveys spaced 300-500 feet apart. LWR systems were later integrated with emerging slab track designs in the late twentieth century, where rigid slab foundations further constrained rail movement but required adapted expansion joints for thermal relief.
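The quoted figure of about 1/2 inch of movement per 100 feet for a 40-50°C rise can be checked with the linear-expansion relation ΔL = L · α · ΔT; the expansion coefficient below is an assumed typical value for rail steel, not taken from the text.

```python
# Sketch: free (unrestrained) thermal expansion of rail steel.
ALPHA_PER_C = 1.15e-5  # assumed linear expansion coefficient, per deg C

def free_expansion_in(length_ft: float, delta_t_c: float) -> float:
    """Unrestrained length change in inches for a temperature rise in deg C."""
    return length_ft * 12.0 * ALPHA_PER_C * delta_t_c

for dt in (40, 45, 50):
    print(f"dT = {dt} C: {free_expansion_in(100, dt):.2f} in per 100 ft")
# Prints ~0.55-0.69 in, consistent with the rough 1/2 inch figure; anchored
# rail converts this movement into longitudinal stress instead.
```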

High-Speed and Slab Track Systems

The development of ballastless slab track systems began in the 1960s to support operations exceeding 200 km/h, addressing limitations of traditional ballasted track such as instability and frequent maintenance under dynamic loads. These systems replace ballast with a rigid concrete foundation, allowing direct fixation of the rails to the slab for enhanced precision and durability. Early innovation centered on Japan and Germany, where research prioritized vibration control and the geometric stability essential for sustained high speeds. In Japan, following the 1964 opening of the Tokaido Shinkansen on ballasted track, Japanese National Railways initiated slab track studies in 1965 to accommodate rising speeds and traffic. The first operational slab track was installed in 1972 as a 12 km trial section on the Sanyo Shinkansen, featuring direct rail fixation to a slab without sleepers, which provided superior stiffness and reduced settlement. This design, known as the Japanese slab track, used a 2.3 m wide, 190 mm thick slab poured on site with the rails fixed during construction, enabling reliable service at up to 210 km/h initially and influencing subsequent expansions. European advances paralleled Japan's efforts, with West Germany trialling the Bögl system in 1977 through a test installation near Dachau. This prefabricated approach used 6.45 m long twin-block concrete slabs, each weighing about 9 tons, with rails fixed directly via clips and mortar bedding to ensure precise alignment and load distribution. Although the French TGV network launched in 1981 on the LGV Sud-Est employed ballasted track primarily for cost reasons, the Bögl system gained traction in other European high-speed projects, such as German lines from the 1990s, where concrete was cast around integrated components for added rigidity. Slab track offers key advantages for high-speed applications, including exceptional lateral and vertical stability at speeds over 320 km/h (200 mph), minimizing oscillation and wear compared with ballasted alternatives. Maintenance needs are greatly reduced—often to near zero for routine adjustments—owing to the absence of ballast degradation, though initial construction costs run 20-30% higher because of the materials and specialized installation involved. Adoption of slab track expanded globally in the early 21st century, notably in China, where the CRTS (China Railway Track System) series was developed from 2004 onward, drawing on Japanese and German technology. By 2008, CRTS I and II slab variants were deployed on the Beijing-Tianjin high-speed line, the nation's first dedicated passenger HSR, supporting operational speeds of 350 km/h with modular prefabricated slabs and adjustable fasteners for precise geometry control. This facilitated rapid network growth, with much of China's roughly 50,000 km of high-speed rail as of 2025 incorporating CRTS track for consistent performance across standard-gauge lines.

Recent Innovations (1945-Present)

In the post-World War II era, railway track design worldwide has increasingly emphasized standardization to facilitate interoperability and economic efficiency, with the 1,435 mm standard gauge emerging as the dominant configuration, used on roughly 60% of the global rail network. This dominance stems from its historical adoption in major economies across North America, China, and much of Europe, enabling seamless cross-border operation and reducing the need for gauge-changing infrastructure. Building on the stability principles of continuous welded rail (CWR), modern tracks have evolved to include advanced guideway designs, such as those for magnetic levitation (maglev) systems, which depart from traditional wheeled track altogether. In Japan, the Yamanashi Maglev Test Line, operational since 1997, introduced superconducting maglev guideways that levitate vehicles using electromagnetic forces, eliminating physical rail contact and achieving speeds over 500 km/h while minimizing wear and energy loss. Efforts to reduce track weight and enhance durability have led to the testing of composite materials for rail components in recent decades, including aluminum-steel composites designed for lighter weight and improved conductivity in the conductor rails of electrified lines. These bi-metallic designs, which bond aluminum for low weight to steel for structural integrity, offer up to 50% weight reduction compared with traditional steel sections while maintaining wear resistance, with prototypes tested in urban and high-speed applications. Complementing these advances, recycled plastic ties gained traction from the 1990s as a sustainable alternative to wooden or concrete sleepers, with early commercial adoptions addressing timber shortages and environmental concerns. By the 2000s, manufacturers had introduced 100% recycled plastic composite ties that resist rot, insects, and chemical degradation, extending service life beyond 50 years in demanding conditions. The integration of digital monitoring has transformed track maintenance since the 2010s, particularly through sensor-embedded systems for predictive maintenance. In the United States, the Federal Railroad Administration (FRA) has spearheaded initiatives such as the Smart Track project in the 2020s, deploying wireless sensors along tracks to monitor geometry, temperature, and stress in real time, enabling automated alerts for defects before failures occur. These embedded fiber-optic and acoustic sensors, often integrated into the rail or its fastenings, help predict problems such as rail breaks or ballast shifts. Sustainability has driven material innovation since 2010, with bio-based sleepers and low-carbon concrete emerging to lower the environmental impact of track construction. Bio-based sleepers, derived from renewable sources such as treated wood or composites, were advanced through international efforts such as the UIC's SUWOS project launched in 2010, promoting preservative-free wooden alternatives that reduce chemical leaching while maintaining load-bearing capacity equivalent to traditional timber. Concurrently, low-carbon concrete formulations for sleepers, incorporating geopolymers or reduced-clinker cements, have achieved emissions reductions of around 40% compared with conventional mixes, as demonstrated in lifecycle assessments of prestressed sleeper production. These innovations cut CO2 output during manufacture—typically the largest contributor to track infrastructure emissions—while preserving structural performance for high-traffic lines.
