Knox played an important role in progressing the study of anatomy in Britain. Working out of 10 Surgeon Square, his practice’s immense demand for corpses also entangled him in the murderous saga of Scotland’s most famous body snatchers, Burke and Hare.
William Burke and William Hare’s 1828 murders are the subject of a History Hit film investigating the notorious serial killers, presented by the After Dark podcast’s Maddy Pelling and Anthony Delaney.
“Doctor Robert Knox was the superstar anatomy teacher in Edinburgh in his day,” Cat Irving, Human Remains Conservator at Surgeons’ Hall Museums, explains in the documentary. Knox practised the Paris manner of dissection, which meant that students would dissect cadavers themselves, rather than observe a teacher.
“Everyone was enthusiastic about his teaching, they came away really knowing what they were doing inside the human body.”
Of course, independent anatomists like Knox, who were linked to but not part of the university, required a supply of bodies. As many as 90 cadavers were needed for a year of Robert Knox’s classes.
There was a legitimate supply of cadavers: the bodies of executed criminals. However, as a private anatomy school, “they’re not entitled to any of that legitimate supply, and the legitimate supply wasn’t enough for the medical schools at the time,” says Irving.
They still somehow secured cadavers: an advert for one of Robert Knox’s classes includes the reassurance that “arrangements have been made to secure as usual an ample supply of Anatomical Subjects.”
The Edinburgh Murders: Burke and Hare
Image Credit: History Hit
“They’re definitely having to get more underhand methods of body supply,” says Irving. “We’re talking about the body snatchers. We’re talking about bribing people in hospitals, undertakers, things like that.”
These illicit corpses cost Knox dearly – seven or eight pounds sterling for one body.
Burke and Hare recognized the anatomists’ growing demand for bodies. Up to this point, they had dealt in the recently dead. Burke had sold a recently deceased lodger in his house to Knox for £7 and 10 shillings. Their opportunism then took a darker turn. Their subsequent murder rampage took the lives of 16 people, their bodies sold for the anatomist’s table.
After a media frenzy, forensic investigation, and trial, Hare walked free after serving as the state’s witness, while Burke was executed and publicly dissected. (His skeleton ended up displayed in Edinburgh Medical School.)
How aware was Robert Knox that his school’s demand for bodies was fuelling not just a clandestine but murderous trade in cadavers? “It seems very likely that he would have some inkling of what was going on,” says Irving. “But he escapes legal justice in that sense. He was never prosecuted.”
Though Burke signed a confession saying Knox had no knowledge of the murders, “the public certainly felt he was guilty,” says Irving. An enraged Edinburgh crowd hung an effigy of Knox and demanded he face justice, but a committee cleared him of complicity.
Burke and Hare’s murder rampage through the streets of Edinburgh is explored in The Edinburgh Murders: Burke and Hare on History Hit.
Julia Pastrana first emerged on the freak show circuit in the mid-19th century, captivating audiences with her distinct appearance. Often referred to as the “Bearded Lady” or the “Bear Woman” due to her condition, hypertrichosis, which caused excessive hair growth across her body and face, she became a sought-after attraction. Theodore Lent, a showman with an eye for profit, saw an opportunity in Julia and took control of her career, managing her performances across Europe.
In 1855, Lent married Julia, further solidifying his control over her life and finances. Together, they continued to tour, with Lent most likely taking all of the proceeds from her performances. But Julia Pastrana was more than just an object of curiosity. She was a multi-talented woman who defied the limitations society placed on her. John Woolf, a guest on Kate Lister’s Betwixt the Sheets, said
In 1855, he married her and they performed around Europe…he most likely took all the proceeds. She spoke numerous languages, could ride on horseback and was a great singer.
In 1860, Julia Pastrana gave birth to a baby boy who inherited her condition. Tragically, both mother and child died shortly after the birth, cutting short the life of a woman who had endured so much. For most, the story would have ended there, but for Theodore Lent, Julia’s death marked a different kind of opportunity. Faced with the loss of his primary source of income, Lent made a chilling decision: he had his wife and child embalmed, turning them into a macabre exhibit.
Woolf tells Kate that
She gave birth to a boy who had the same condition as her. Heartbreakingly they both died and Lent saw his opportunity of income slipping away.
For years after their deaths, Theodore Lent continued to display Julia Pastrana and her son to audiences across Europe, refusing to let death be the end of the show. It was a disturbing chapter in an already grim story of exploitation, as Lent paraded their preserved bodies in front of paying crowds, further dehumanizing the woman he had once called his wife.
Poster showcasing Julia Pastrana at the show
Image Credit: wellcomeimages.org
Theodore Lent’s obsession with profiting from those he controlled did not end with Julia Pastrana. In the 1860s, he married another bearded woman named Marie Bartell, whom he presented as Julia’s sister. This was yet another attempt to capitalize on society’s fascination with physical difference, continuing his pattern of exploitation.
Julia Pastrana’s story serves as a stark reminder of the cruelty faced by many who were part of the freak show industry. Reduced to mere spectacles, their humanity was often ignored or dismissed in favor of profit. But in recent years, there has been a growing recognition of the injustices they endured.
In a symbolic act of redemption, Julia Pastrana’s remains were finally repatriated to her native Mexico in 2013. After more than 150 years of posthumous exploitation, she was laid to rest with dignity, allowing her story to come to a more peaceful conclusion.
Julia Pastrana’s life was one of resilience in the face of unimaginable challenges. Despite the cruelty she endured, she was a woman of talent, intelligence, and strength. Her story is a haunting reminder of the ways in which society can fail those who are different, but it also speaks to the enduring human spirit. Today, she is remembered not only as a figure in the history of freak shows but as a woman who deserves to be seen beyond the spectacle.
Coca leaves had been a central feature of Incan cultural life centuries before Europeans adopted the substance in the late 19th century. But the key moment in the popularisation of the coca leaf in Britain came with the rise of competitive long-distance walker Edward Payson Weston.
Weston was an American participant in the spectator sport of pedestrianism who established his celebrity when he walked nearly 500 miles from Boston to Washington D.C. in 1861 in a little over 10 days.
“He came to dominate the world of this very strange Victorian sport, essentially competitive long-distance walking,” says Dr Douglas Small on Dan Snow’s History Hit podcast. “This sounds remarkable to us now but the Victorians absolutely went mad for this.”
Women gathering leaves of the coca plant (Erythroxylum coca) in Bolivia. Wood engraving, c. 1867.
Image Credit: Wellcome Collection / Public Domain
When he visited Britain in 1876, some 5,000 people watched him compete in a 24-hour championship race against Englishman William Perkins. After winning the race, Weston revealed that his doggedness had been fuelled, in part, by munching on coca leaves.
“That’s actually what really moves coca for British people from being something that’s occasionally discussed in travellers’ tales, something that’s mentioned every now and again in accounts of life in South America, to being something that people are really interested in,” explains Small, a historian and author of Cocaine, Literature, and Culture, 1876-1930.
“[It] almost becomes for a while like tea and coffee, something that people really want to use in their daily lives.”
By this point the use of steamships across the American continent and the Atlantic meant that the transport and supply of coca had become easier. With new demand, people began to acquire and use coca leaves in a way they hadn’t previously.
As a result, Victorians started chewing coca leaves as the Andeans had been doing for centuries. Coca consumers even filled Mincing Lane, the centre of London’s 19th century tea and spice trade, looking to purchase what had so recently been a rarity.
“Very quickly after Weston popularises their use they catch on amongst all kinds of sportsmen,” says Small.
“They start being advertised for bicyclists, other pedestrians. There are accounts that are written in the British Medical Journal that talk about how great it is for shooting parties because they apparently help to stabilise your nerves and give you a bit more pep and confidence which people say makes them much better shots.”
Illustration from ‘The Sportsman’s Cyclopaedia’ by TB Johnson, 1848.
Image Credit: Wellcome Collection / Public Domain
They were even given to difficult racehorses before races.
A boom in chewing coca leaves emerged in the 1870s and 1880s. Yet this was mere foreshadowing for the later prevalence of cocaine, which took off from 1884 thanks to innovations in the European chemical industry.
Cocaine is stronger in its effects than raw coca leaves. Sigmund Freud was among its advocates for use as a stimulant and therapy for morphine addiction. But it found lasting use as an effective local anaesthetic, an essential in medical science for decades. A century later, cocaine is one of the most criminalised substances on earth.
Listen to Dan Snow’s History Hit now or sign up to History Hit for advert-free listening, with early access and bonus episodes for subscribers.
Here we explore the safety issues in 19th century shipping that Plimsoll wanted to address, his campaigning on maritime safety, and its ongoing impact.
Enormous growth in world trade meant 19th century merchant shipping became increasingly competitive. The ‘Lloyd’s Rule’, introduced by Lloyd’s Register in 1835, stipulated that classed vessels should have 3 inches of freeboard (the distance from the waterline to the weather deck) for every foot of depth in the hold. But the rule was only optional, and many transatlantic ships were still overloaded by their unscrupulous owners in order to maximise profits.
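The arithmetic behind the 1835 rule is simple enough to sketch. The snippet below is illustrative only (the function name is our own, and real classification involved far more than this single ratio), but it shows the minimum freeboard the rule implied:

```python
def lloyds_rule_freeboard_inches(hold_depth_feet: float) -> float:
    """Minimum freeboard under the 1835 'Lloyd's Rule':
    3 inches of freeboard for every foot of depth in the hold."""
    return 3.0 * hold_depth_feet

# A vessel with a 20-foot-deep hold needed at least
# 60 inches (5 feet) of freeboard from waterline to weather deck.
print(lloyds_rule_freeboard_inches(20))  # 60.0
```

Because compliance was voluntary, an owner could simply load past this line; that gap between rule and enforcement is what Plimsoll's campaign set out to close.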
Often overinsured, many of these overloaded wooden sailing ships were also often unseaworthy, worth more to their owners sunk than afloat. Usually old and riddled with wood-rot, woodworm and shipworm, many were repainted, renamed and falsely stated to be new ships.
The subsequent risks to crew members’ lives led to such ships being nicknamed ‘coffin ships’. Indeed, at the time there had been over 2,000 cases of sailors who had signed on as crew being tried in court for refusing to board a ship upon seeing its condition, and in 1855 a group of sailors had even written to Queen Victoria complaining of being found guilty of desertion for objecting to going to sea in dangerous ships.
After leaving school early, Samuel Plimsoll became a clerk and later manager at Rawson’s Brewery. Yet having failed in his attempt to become a London coal merchant, Plimsoll was reduced to destitution in 1853 – an experience that helped him sympathise with the struggles of the poor. When his life picked up, he resolved to devote his time to improving their condition. After becoming a Liberal MP for Derby in 1867, Plimsoll investigated ship safety and was shocked upon discovering the scale of life lost at sea.
Aware of growing widespread concerns about the unsafe loading of ships and the many thousands of lives and ships being lost, together with his wife Eliza Plimsoll (an equal partner in the cause), Samuel led a decades-long legal, social, and political battle for justice against ‘coffin ships’. He campaigned to pass a bill for the introduction of a mandatory safe load line on ships.
Left: Samuel Plimsoll. Right: Portrait of Samuel Plimsoll (1824-1898), painted by Reginald Henry Campbell
Image Credit: Left: Wikimedia Commons / Public Domain. Right: Wikimedia Commons / Reginald Henry Campbell / Royal Museums Greenwich / Public Domain
Plimsoll was unsuccessful due to opposition from merchants and the number of powerful ship-owning MPs in Parliament. Undeterred, he published a book in 1872 called Our Seamen which detailed evidence of reckless overloading, the poor condition of boat hulls and equipment, undermanning, filthy crew accommodation, the prevalence of over-insurance and the deliberate sinking of unsound and unprofitable ‘coffin ships’.
Plimsoll’s book became nationally well-known, prompting a campaign that led to the appointment of a Royal Commission on Unseaworthy Ships in 1873, to assess evidence and recommend changes. While associated with Plimsoll, load lines had been used dating back to the 12th century in Venice, but it wasn’t until the 19th century that their use became more widespread.
In 1874 Lloyd’s Register made it a condition of their classification that a load line was painted on newly built awning deck steamers. This original load line was a diamond with a centre line and the letters ‘L’ and ‘R’ next to it, and aimed to show how low a ship could safely rest in water without the risk of sinking. However, this only applied to ships inspected by Lloyd’s Register, and other ships could do as they pleased.
A plimsoll line – load line mark and lines on the hull of a ship
Image Credit: Wikimedia Commons / Flickr by Brinki / cc-by-sa-2.0
In 1875 a government bill was introduced to address the problem, and although Plimsoll regarded it as inadequate, he resolved to accept it. However, after Prime Minister Benjamin Disraeli later announced the bill would be dropped, Plimsoll called members of the House “villains” and shook his fist in the Speaker’s face. Disraeli called for him to be reprimanded, but after the matter was adjourned for a week, Plimsoll apologised.
Nevertheless, many people shared Plimsoll’s view that the bill had been stifled by the pressure of the shipowners. Ultimately, the power of public feeling forced the government to pass the Unseaworthy Ships Bill, eventually resulting in the Merchant Shipping Act 1876.
The Merchant Shipping Act 1876 required all foreign-going British vessels, coasting vessels over 80 tons and foreign ships using British ports to have compulsory deck lines and load lines marked on their hull to indicate the maximum depth to which the ship may be safely immersed when loaded with cargo. (This depth varies depending on the ship’s dimensions, cargo type, time of year, and the water salinity and density it would encounter in port and at sea. Once these factors have been accounted for, a ship’s captain can determine the appropriate ‘Plimsoll line’ needed for the voyage.)
Stringent powers of inspection were given to the Board of Trade to enforce this rule. However, fierce opposition meant the act was misused by many: it was left to ship owners to decide where a load line was to be painted and to paint the lines themselves (some even painted them on the ship’s funnel). To overcome this, Lloyd’s Register surveyors gathered data on vessels’ strength and construction, which was used to draw up the UK’s Board of Trade Load Line Tables in 1886, and the position of the load line on all ships was finally fixed by law in 1890. This line became known as the ‘Plimsoll Line’ in Britain.
Despite being re-elected at the 1880 general election by a great majority, Samuel Plimsoll gave up his seat to William Vernon Harcourt, believing that Harcourt, as Home Secretary, could advance sailors’ interests more effectively. Having then been offered a seat by 30 constituencies, Plimsoll unsuccessfully stood in Sheffield Central in 1885, but later became estranged from the Liberal leaders, regarding them as having neglected the question of shipping reform. Nevertheless, thanks to Plimsoll’s campaigning, countless lives and ships have since been saved.
By the early 1900s, many countries had adopted their own loading regulations, and from 1906, foreign ships were required to carry a load line if they visited British ports. In 1930, the first International Convention on Load Lines established an international solution. Later, in 1966, the International Maritime Organization (IMO), a UN agency responsible for ship safety, adopted a new Convention ensuring ships had enough reserve buoyancy, specifying the freeboard required in different climate zones and seasons via a load line zone map:
Load line and freeboard conference, from the Lloyd’s Register publication 100A1, 1959
Image Credit: Lloyd’s Register Foundation
The original ‘Plimsoll line’ was a circle with a horizontal line through it to show the maximum draft of a ship. Additional marks have been added over the years, allowing for different water densities and expected sea conditions. Letters may also appear to the sides of the mark indicating the classification society that surveyed the vessel’s load line.
Load Line Mark and Lines and Timber Load Line Mark and Lines for power driven merchant vessels. (TF – Tropical Fresh Water, F – Fresh Water, T – Tropical Seawater, S – Summer Temperate Seawater (NB – The ‘Plimsoll Line’ and the ‘Summer Line’ are the same thing – all the other lines take their positions from there), W – Winter Temperate Seawater, WNA – Winter North Atlantic Prefix, L – Lumber, L ⦵ R – Lloyd’s Register)
Image Credit: Wikimedia Commons
Now, when a ship is commissioned, the exact location of the load line is calculated by a classification society, its position on the hull is verified and a load line certificate is issued. Calculations take into account the route the ship will take, and the seasons and sea temperature conditions of the geographic locations the ship will pass through en-route to its destination to ensure its adequate stability. The basic symbol, of a circle with a horizontal line passing through its centre, is now recognised worldwide.
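As a rough illustration of how the marks in the caption above relate to zone and season, here is a deliberately simplified toy lookup. The zone names, function name, and decision order are our own assumptions for illustration; real assignments follow the 1966 International Convention on Load Lines and a ship’s load line zone map, not this sketch:

```python
def applicable_mark(zone: str, season: str, fresh_water: bool = False) -> str:
    """Toy selection of a load line mark (TF, F, T, S, W, WNA)
    from a climate zone and season. Illustrative only."""
    if zone == "tropical":
        # Tropical Fresh Water vs Tropical Seawater
        return "TF" if fresh_water else "T"
    if fresh_water:
        return "F"  # Fresh Water
    if zone == "winter_north_atlantic":
        return "WNA"  # Winter North Atlantic
    # Summer line is the 'Plimsoll Line' itself; winter sits below it
    return "S" if season == "summer" else "W"

print(applicable_mark("tropical", "summer"))   # T
print(applicable_mark("temperate", "winter"))  # W
```

The point of the sketch is simply that the mark is not one line but a family of lines, chosen per voyage from the waters the ship will pass through.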
You can find out more about the history of Lloyd’s Register Foundation and their work supporting research, innovation and education to help the global community tackle the most pressing safety and risk challenges at www.lrfoundation.org.uk
It was at the height of Britain’s global, imperial power that Cutty Sark embarked on treacherous journeys across the world’s biggest oceans. Not only was she the fastest vessel of her day, but she could carry a million pounds of tea from China to Britain to quench the thirst of the Victorian public. Vessels like the Cutty Sark were a central plank in Britain’s expanding networks of trade and commerce, which drove the empire’s growth in the 19th century.
By 1901, the year Queen Victoria died, the British empire embraced 12 million square miles of the globe. British merchants also thronged the wharves of ports outside Britain’s possessions, such as in China, Syria and South America.
Photographed by Allan C. Green, 1926.
Image Credit: State Library Victoria / Public Domain
The goods these ships carried introduced Victorians to new products: tea from China and India; coffee from the Middle East; spices from Southeast Asia; textiles from Egypt; timber from Canada; and frozen meats from Australia and New Zealand.
These goods arrived as raw commodities from Britain’s colonies, with manufactured goods shipped back in return, protected on the high seas by the Royal Navy. Although British merchant vessels had the Navy’s protection, they could not afford complacency when it came to speed and efficiency. Key to Cutty Sark’s fame and success was the state-of-the-art technology with which she was outfitted.
At the time, competing fleets utilised whatever technological advantages were available in order to dominate trade and commodities. Among the Cutty Sark’s forest of sheets and halyards is evidence of significant changes in shipbuilding.
Cutty Sark’s hull is among the sharpest of the tea clippers, meaning it required ballast for stability when unladen. Constructed from teak above the waterline, rock elm below and with a keel of pitch pine, it also featured metal sheeting over its hull. This kept the hull cleaner, so it sailed faster. It also had a wrought-iron frame to which all external timbers were secured by bolts. This made it stronger and less susceptible to leaks which would occupy valuable crew time to remedy.
“To maintain their edge, shipbuilders and architects are having to pioneer and innovate new technologies and techniques of shipbuilding,” explains Max Wilson, Senior Archivist at Lloyd’s Register Foundation, a public-facing library and archive holding material concerning over 260 years of marine and engineering science and history. “We see this starkly over the 19th century.”
A plan and survey report for the Cutty Sark.
Image Credit: Lloyd's Register Foundation
Designers aimed to maximise the speed of cargo delivery, harness cargo carrying capacity to bolster the safety of passengers and goods, and to increase the number of journeys ships could undertake. Ultimately, they aimed to increase their share of the merchant trade.
Cutty Sark was built for ship owner John Willis in Scotland in 1869, against a backdrop of great transformations in shipbuilding. Shipbuilding moved northwards in Britain, while motive power was shifting from wind to steam and wood construction was being replaced with iron and composite solutions.
As the ‘age of sail’ threatened to pass into memory, the Cutty Sark was a last throw of the dice. In 1840 steam ships made up about 4 percent of Britain’s merchant fleet: by around 1870, this had grown to around 20 percent. By 1890, this would be around 75 percent. The Cutty Sark was built as a way to demonstrate the power of sail and wind against steam power.
“The sailing ships were still very reliable at that time,” says Zach Schieferstein, Archivist at Lloyd’s Register Foundation. “Ships like the Cutty Sark that were built for a purpose of transporting tea were built for speed and travelling long distances, getting the first tea of the season ready to sell for those high premiums.”
The Cutty Sark’s fastest recorded speed was 17.5 knots, considerable for a cargo-carrying sailing ship, and the furthest she travelled in a 24-hour period was around 350 nautical miles. “It wasn’t uncommon to make the journey from Shanghai to ports in Britain in about 100 to 120 days. For the time, it was really setting records.”
Helping her rack up these miles were 32 sails, which could stretch over 32,000 square feet, suspended on 11 miles of rigging. She carried an average of 26 crew and was larger than clippers that had come before: with a gross tonnage of 963, a length of 212.5 feet, breadth of 36 feet and depth of 21 feet.
Cutty Sark traded between China, Australia, later to South Africa and South America, and for a while held the record for journeying between Australia and Britain. Over the course of the century, tea had emerged as the national drink and the Cutty Sark became associated with tea races. The annual tea race was a Victorian sensation. A premium or bonus was paid to the ship that arrived with the first tea of the year. Clipper ships like Cutty Sark raced from China’s tea ports to London to fetch the highest price for its cargo. In 1866’s so-called ‘great tea race’, the progress of ships was reported by telegraph and followed in the papers, with bets placed on the outcome.
In this period there was a simultaneous explosion in the service industries attached to shipping.
For example, Edward Lloyd’s Coffee House had become known as the meeting place in the City for those seeking shipping intelligence. In 1760, the underwriters, brokers, ship-owners and merchants who associated through the coffee house formed a committee, the Society for the Registry of Shipping. The Register Book the Society published, later to become Lloyd’s Register, provided critical information on vessel seaworthiness for merchants and underwriters assessing the risks of any one voyage, and from 1764 the Society funded surveyors to list, rate and class the condition of vessels. This was the true beginning of classification: Lloyd’s Register was the world’s first classification society, and today possesses one of the best archives of ships and ship-building in the world.
“It was born out of this desire to have reliable and up-to-date information on merchant shipping,” explains Schieferstein, “and to make it safer as well, for passengers, for cargo and for the crew.”
The first mention of the Cutty Sark in the Register Book, from the supplements section of the 1869 Register Book.
Image Credit: Lloyd's Register Foundation
Lloyd’s Register’s surveyors would assess a ship, using “A1” from 1768 to indicate a vessel of the highest class, thereby introducing the term “first rate” to denote quality. Cutty Sark was given this A1 rating.
Continuing developments in steam technology resulted in the sale of Cutty Sark, to serve first as a Portuguese cargo ship, and later as a training vessel in Cornwall and on the Thames. It was towed into its current dry dock in 1953 to become one of the nation’s most treasured heritage sites, whose story becomes richer with the documents and records collated by Lloyd’s Register Foundation.
You can find out more about the history of Lloyd’s Register Foundation and their work supporting research, innovation and education to help the global community tackle the most pressing safety and risk challenges at www.lrfoundation.org.uk
But was that famous moment really when the game was invented? Here we take a look at the evolution of the sport, from its earliest versions, to the global phenomenon it is today.
Rugby’s origins can actually be traced back over 2,000 years to the Roman game of harpastum, derived from the Greek word for ‘seize’, which involved handling a ball. This game may have been played during the Roman occupation of Britain.
Although the game was eventually codified at Rugby School, various forms of traditional football involving ball handling and scrummaging formations were played throughout medieval Europe and beyond. Different regions had their own variations, including New Zealand’s Ki-o-rahi, Australia’s marn grook, Central Italy’s Calcio Fiorentino, and Japan’s kemari, among others.
Main: “Football” match in Piazza Santa Maria Novella in Florence, between 1523 and 1605 by Stradanus, based on a design by Giorgio Vasari. Inset: “Harpastum”, a form of ball game played in the Roman Empire, circa 100 BC – 400 AD
Image Credit: Main: Wikimedia Commons / Stradanus / Giorgio Vasari / Public Domain. Inset: Wikimedia Commons / Public Domain
Various early ball games were played during the Middle Ages (5th to 16th century). In England during the 14th and 15th centuries, documents record young men leaving work early to compete for their village or town in football games, which could be fairly violent. Shrove Tuesday football matches in particular became annual traditions, and there were many regional variations, often taking place over a wide area, across towns, villages, fields, and streams.
These local games continued well into the 19th century until football for the common man was gradually suppressed, notably by the 1835 Highways Act which forbade the playing of football on highways and public land. However, the sport did find a home in English public schools, where it was modified into two main forms: a dribbling game primarily played with the feet (promoted at Eton and Harrow), and a handling game (favoured by Rugby, Marlborough, and Cheltenham).
A ‘Foot Ball’ game between Thames and Townsend clubs, played at Kingston-upon-Thames, London, Shrove Tuesday, 24 February 1846
Image Credit: Wikimedia Commons / Public Domain
The roots of modern rugby can be traced to Rugby School in Warwickshire, England. In 1749, the boys’ school moved from the town centre to a new 8-acre site on the edge of the town known as the Close, providing more space for the boys to exercise.
The football played there between 1749 and 1823 had few rules. Although touchlines had been introduced to demarcate the playing area, the game was still fairly hectic, with teams often consisting of around 200 boys. The ball could be handled, but running with the ball was not allowed; progress towards the opposition’s goal was made by kicking.
While there is some debate and legend surrounding the exact moment and individual responsible for the game’s inception, the story of William Webb Ellis is perhaps the most enduring. According to legend, in autumn 1823, a young William Webb Ellis disregarded the established rules of football and, during a game on the Close, picked up the ball and ran with it.
According to the rules of the day, the opposing team could only advance to the spot where the ball had been caught, and Ellis should have moved backwards to give himself enough room to either kick the ball up the field or place it for a kick at goal. Instead, Ellis’ impulsive act is said to have laid the foundation for the game of rugby as we know it today.
However, the veracity of this tale is debated. While it is known that Webb Ellis was a student at Rugby School at the time, the only evidence is an account by the antiquarian Matthew Bloxam, cited by the Old Rugbeian Society in its 1897 report on rugby’s origins. Nevertheless, the symbolism of Webb Ellis’s actions has persisted, and he remains an iconic figure in the sport, with the Rugby World Cup trophy named after him.
Left: Webb Ellis carries the ball during a school football match played in 1823. Right: The only known contemporary image of Webb Ellis, published in the Illustrated London News, 29 April 1854.
Image Credit: Left: Wikimedia Commons / Public Domain. Right: Wikimedia Commons / The Illustrated London News (issue 24, page 400) / Public Domain
Rugby’s lack of standardised rules resulted in a variety of playing styles, and a somewhat chaotic playing field.
Rugby School, which gave its name to the sport, played a pivotal role in rugby’s development. Encouraged by Rugby School’s influential headmaster Thomas Arnold (headmaster from 1828-1842), many pupils from this time were instrumental in the game’s expansion. By 1841, the rules and fame of ‘rugby’ had spread fast as Rugby School’s pupils moved on to university (mostly to Oxford and Cambridge), prompting a need for standardisation.
In 1845, the first written rules of rugby were drawn up at Rugby School itself, formalising features such as the ‘scrummage’ (the precursor to the modern scrum) and restrictions on handling the ball forward. These rules played a significant role in shaping the modern sport, and fed into the ‘Cambridge Rules’ later drawn up by members of Cambridge University, which drew on the football played at several schools.
By 1863, boarding schools and clubs had developed further rule sets. Increasingly, rugby was seen as a sport of British imperial ‘manliness’, associated with the education of young gentlemen in public schools and universities.
After graduation, many young men wanted to continue playing. Following the formation of the first football clubs in the mid-19th century, rugby gradually became institutionalised. Blackheath and the Edinburgh Academicals were some of the first rugby clubs to form in 1858, and club matches began in England when Blackheath played Richmond in 1863.
In 1863, representatives of leading football clubs met to attempt to establish a common set of rules, but disputes arose over issues like handling the ball and ‘hacking’ (a tactic of tripping opponents and kicking their shins). Both were allowed under rugby’s rules but prohibited in other forms of football.
Advocates for rugby, led by F.W. Campbell of Blackheath, staunchly defended hacking, considering it character-building and its abolition ‘unmanly’. Consequently, rugby did not adopt the rules established for the newly formed Football Association (FA), leaving rugby outside the FA’s jurisdiction. (Hacking was later abolished during the late 1860s).
However, the death of a player in a practice match in 1871 prompted members of leading rugby clubs to organise an official meeting. That same year, Rugby saw its first international match when Scotland faced England in Edinburgh, resulting in victory for Scotland. This historic game marked the beginning of international rugby and, combined with the official rugby club meeting, led to the formation of the Rugby Football Union (RFU).
(Hacking remained a part of the game at Rugby School, causing the school to delay its entry into the RFU until 1890.)
The “First international”, Scotland v England in Edinburgh, 28 March 1871
Image Credit: Wikimedia Commons / Public Domain
The first official university match was played in 1872, and graduates from these universities introduced rugby to other British schools. Former pupils (‘Old Rugbeians’) who had joined the army officer class helped expand the game internationally. By 1886, the International Rugby Board (now ‘World Rugby’) was established, and rugby began to gain popularity among middle and working-class men.
As rugby became more standardised, it became renowned for strict adherence to the rules and the spirit of the game, with a strong emphasis on discipline, self-control, mutual respect, and fair play. By the late 19th century, rugby, along with cricket, was seen as a sport that cultivated the ‘civilised’ manly behaviour of the elite, instilling values of unselfishness, fearlessness, teamwork, and self-control.
In 1895, a significant split occurred in rugby when clubs in Northern England formed the Northern Rugby Football Union (NRFU). This stemmed from a dispute over player compensation and working-class participation. The NRFU allowed player payments, which were prohibited by the Rugby Football Union (RFU), rugby union’s governing body.
The NRFU was renamed the Rugby Football League in 1922, by which time the breakaway code had developed distinct rules – including 13-a-side teams and, later, a limited-tackle rule – with a focus on speed and agility. Rugby League gained popularity in England’s northern regions and parts of Australia, while Rugby Union continued to dominate in the south and expanded globally.
The division between Rugby Union and Rugby League persisted for decades, with each developing its own distinct culture, traditions, and following. It was only in the late 20th century that Rugby League began to regain ground, especially in Australia and New Zealand.
Over the years, more nations embraced rugby, leading to the establishment of international competitions like the Six Nations and the Rugby Championship. In 1900, Rugby Union became an Olympic sport, and by 1908, three major Southern Hemisphere nations – New Zealand, Australia and South Africa – played international matches against Northern Hemisphere nations.
While rugby was later dropped from the Olympics in 1924, the inaugural Rugby World Cup was held in 1987. Additionally, Rugby Sevens (played with smaller teams in matches lasting 14 minutes) has been featured in the Olympics since the 2016 Rio Olympic Games.
The interior of Twickenham Stadium in 2012, England’s home stadium
Image Credit: Wikimedia Commons / Photo by DAVID ILIFF. / License: CC BY-SA 3.0
In the late 20th century, commercialism and television’s growing influence led to the professionalisation of rugby, allowing players to earn a living from the sport, and raising its standards and global appeal. While historically a sport for men (despite women’s games being played as early as the 1880s), Rugby Union has made progress in promoting women’s rugby, with the Women’s Rugby World Cup, beginning in 1991, instrumental in advancing the women’s game.
Rugby continues to evolve, with new nations emerging as competitive forces on the international stage. Japan’s impressive performance in the 2019 Rugby World Cup generated interest in Asia, contributing to rugby’s fast-growing global reach. However, concerns over player welfare, particularly regarding concussion management, have prompted changes in the laws of the game once more, highlighting rugby’s ongoing evolution.
Two handwritten messages, written almost a year apart on the same piece of paper. The first, dated May 1847, ends in high spirits: ‘Sir John Franklin commanding the Expedition. All Well.’ The second message is a stiff scribble in the margins, added the following April. It tells of the death of 24 men, including Franklin, the abandonment of his expedition’s ships HMS Erebus and Terror, and a desperate plan to trek overland to safety.
Found in 1859 inside a cairn on King William Island, the Victory Point Record is one of the most evocative documents in the history of Arctic exploration. It is crucial evidence in unpicking the mystery of what happened to the Royal Navy’s failed 1845 attempt to chart the Northwest Passage through the Arctic, from which 129 men never returned.
What happened?
In 1845, the aptly-named HMS Terror, along with HMS Erebus, set off from Britain towards what is now Nunavut in Northern Canada in a quest to discover the fabled Northwest Passage – a navigable Arctic route linking the Atlantic Ocean to the Pacific. Its discovery held the promise of global trade, and of heroism for the crew.
Having been sought for at least a century, previous explorations of the Arctic coastline had led to optimism that the Northwest Passage’s discovery was within reach. Britain was keen to be the nation that found, and thus controlled, the passage.
The expedition took place during the early years of Queen Victoria’s reign, when Britain presided over the beginnings of an empire that would become vast. Tasked with a mission of immense geographical significance, the crew also carried the weight of British hopes and ambitions – their success in finding the Northwest Passage would bring further glory, trade and wealth to the nation.
However, as the expedition unfolded, the once-optimistic mission turned into a harrowing nightmare. The strict rules that governed life on board Royal Navy vessels disintegrated, leading to what would ultimately become the worst disaster to hit Britain’s Royal Navy in its history of polar exploration.
The expedition was commanded by Captain Sir John Franklin, a seasoned polar explorer who had already led two previous Arctic expeditions and was keen to claim his prize. Under his command were 129 men. Franklin, a respected and well-liked figure, had served at the Battle of Trafalgar and was a household name in Britain. His quest to claim the Northwest Passage thus garnered significant public attention.
Francis Crozier, second in command, was a skilled sailor, yet had faced challenges in his naval career as an Irishman. At the time of the expedition, the Great Irish Famine was beginning, and Crozier, hailing from County Down, saw this mission as an opportunity to make his name.
Left: Captain Sir John Franklin in 1845 (expedition leader and commander of HMS Erebus); Right: Captain Francis Crozier, taken on 16 May 1845 on H.M.S. Erebus at Greenhithe, Kent (commander of HMS Terror)
Image Credit: Left: Wikimedia Commons / Maritime Greenwich Souvenir Guide, London 1993 / Public Domain. Right: Wikimedia Commons / Francis Rawdon Moira Crozier / Public Domain
Both ships, HMS Terror and Erebus, were relatively old bomb vessels – Terror had been launched in 1813 and Erebus in 1826. They had sailed all over the world, and whilst not specifically designed for polar expeditions, had been to polar regions before, and were considered powerful and luxurious for their time, with heating systems and space for vast food supplies. Many of the expedition’s crew, including Franklin, had served on these exact ships before – emblems of military might, colonial prowess, and imperial expansion.
The ships carried a substantial supply of provisions, including 3 years’ worth of tinned food, as well as livestock, 7,000 lbs of tobacco, 2,700 lbs of candles, a cat, a dog called Neptune, and even a monkey gifted by Franklin’s wife.
The ships made stops in Scotland’s Orkney Islands and Greenland before heading to Arctic Canada, and early letters written home describe Franklin as a great commander.
As they ventured north, HMS Terror and Erebus were last seen by a whaler in Baffin Bay, waiting for ice to clear in the Lancaster Sound.
Soon they entered remote territory that Inuit rarely visited, and as winter set in, the ships froze into the pack ice near Beechey Island. Expeditions were experienced at overwintering in the ice, yet, unmoving, the crew battled boredom and hardship in the darkness. Conditions were severe, with the men at constant risk of hypothermia, frostbite, and other cold-related injuries. Temperatures could drop to -35°C by day and reach -48°C at night, when even sweat turned to ice.
Any naval ship was a concentrated microcosm of society at home, and the intrinsic hierarchies that regulated the crew’s behaviour inevitably led to class tensions on board, exacerbated by the extreme conditions.
In spring, the ships sailed south down Peel Sound, but were soon trapped again by ice near King William Island, close to the McClintock Channel. In spring 1847, a group of crew members travelled across the ice to Point Victory and left a written record of their expedition.
The ships continued to drift south down the Victoria Strait, marking their last recorded position just northwest of King William Island. In April 1848, Captain Crozier, who by now had taken command, ordered the crew to abandon the ships.
What had started as a great royal naval expedition had turned to disaster.
Map of the probable routes taken by HMS Erebus and HMS Terror during Franklin’s lost expedition
Image Credit: Wikimedia Commons / Hans van der Maarel / CC BY-SA 4.0 / Public Domain
Of the 105 men who set out across the ice under Captain Crozier, none would survive the march south. Weakened by starvation, scurvy, pneumonia, tuberculosis, and lead poisoning, they split off into smaller groups as supplies dwindled.
Back in Britain, Lady Franklin had become increasingly concerned. Knowing the expedition carried only 3 years’ worth of supplies, by 1847 she was petitioning for a search party, even appealing to the Tsar of Russia and the US President. After two years without communication, the Admiralty sent out a search party, but without success.
It wasn’t until 1854 that searchers uncovered the first firm evidence of the crew’s fate. What they found was shocking.
The exact circumstances of the men’s deaths remain a mystery, but among the scattered remains of the crew were mutilated body parts, some hacked with knives, and others placed in cooking pots. It appears that for around 30 of the crew, cannibalism had become a last, miserable resort.
One of the searchers, Dr John Rae, reported his findings, having recovered artefacts and gathered testimony from local Inuit, writing:
From the mutilated state of many of the bodies and the contents of the kettles, it is evident that our wretched countrymen had been driven to the last dread alternative as a means of sustaining life. A few of the unfortunate men must have survived until the arrival of the wild fowl, say until the end of May as shots were heard and fresh bones and feathers of geese were noticed near the scene of the sad event.
Bodies were found underneath an upturned rowing boat, indicating an attempt at shelter. Due to the conditions, the men were well preserved; all still had their clothes on, along with objects brought from the ship. Their tent was still standing, and some bodies were found near a former campfire. Bodies from earlier in the expedition had been buried, and when exhumed, were found to be almost completely preserved.
The discovery was shocking. Initially Rae’s integrity was called into question, especially by Lady Franklin, yet skeletons found by later search parties confirmed Rae’s conclusions, proving catastrophic for the Royal Navy’s reputation.
For a crew to abandon a ship, something had gone significantly wrong. Unfortunately there are no detailed written records, as most documentation from the ships has been lost. The sole surviving piece of information is the Victory Point Note – discovered in 1859 by a search expedition sent by Lady Franklin.
The note, left in a stone cairn built by a previous expedition in 1831, was used twice, on two separate dates. The first entry, dated 28 May 1847, initially appears positive, with ‘all well’ underlined several times. A few men may already have died by this point, but some losses to disease and scurvy were to be expected on such an expedition. Interestingly, although still in command, Captain Franklin did not sign the note himself.
However, by the second entry on 25 April 1848, it’s clear a dire situation had unfolded. Written in the margins around the form, in a more scrawled hand, the note explains the Erebus and Terror had been abandoned for 3 days, having been stuck in the ice since 12 September 1846. Captain Franklin had died on 11 June 1847, along with a further 9 officers and 15 men. The survivors intended to walk to a remote fur-trading outpost, hundreds of miles away.
The “Victory Point” note – found by Francis Leopold McClintock’s Expedition team in a cairn on King William Island in 1859, detailing the fate of the Franklin Expedition (written on a standard Admiralty form)
The decision to abandon the ships would have been a last resort – an undoubtedly frightening and heartbreaking one for the crew.
In 1981, forensic anthropologist Dr Owen Beattie analysed some of the human remains collected from sites on King William Island using modern forensic techniques, and found high amounts of lead, leading to the theory that lead poisoning may have played a role in the expedition’s tragic end.
Beattie and a specialised team also exhumed and autopsied three exceptionally well-preserved crewmen who had been buried during the expedition’s first winter in the Arctic. Analysis of hair and tissue samples provided further evidence of lead poisoning, likely caused by contamination from the lead solder used to seal the expedition’s tinned food.
Numerous expeditions were launched to locate the shipwrecks, and finally, in 2014, HMS Erebus was discovered off King William Island. Two years later, in 2016, HMS Terror was found 45 miles away in Canada’s aptly-named Terror Bay, also off the coast of King William Island. Both wrecks lay far south of where the ships had been abandoned.
HMS Terror’s wreck is exceptionally well-preserved, with crockery, glasses, furniture, and scientific instruments still in their original positions. The Parks Canada team also found sediment covering the ship which, along with cold water and darkness, created an anaerobic environment ideal for preserving delicate items like textiles and paper. This means journals, charts, maps and photographs still on board may yet prove salvageable.
Chinaware from the HMS Terror. On display at the Canadian Museum of History.
There is also a long, heavy rope line running through a hole in the deck suggesting an anchor line was deployed before HMS Terror sank. Given the location of the find, this has led to a theory that the remaining crew may have closed down HMS Terror and re-boarded HMS Erebus, in a desperate attempt to escape south.
Underwater archaeologists from Parks Canada in collaboration with the Inuit Heritage Trust now have joint control of those sites, and are recovering artefacts from the wrecks, many of which are now on display in museums. Although evidence of what happened is scattered and subject to changing landscapes, further melting of sea ice through climate change may yet reveal more of this fateful expedition in future.
Pasteur’s work spanned multiple areas, laying the foundation for the germ theory of disease, sterilisation techniques, the development of vaccination, and the concept of pasteurisation – revolutionising the understanding and practice of medicine and leaving a lasting legacy that continues to shape modern medicine and human health.
Here we explore more about the impact Pasteur and his work have had.
Louis Pasteur was born on 27 December 1822, in Dole, in eastern France. From an early age, he demonstrated a keen interest in chemistry and pursued his education in the field at the École Normale Supérieure in Paris. He excelled in his studies, earning his master’s degree in 1845 and his doctorate in 1847, and became an assistant professor of chemistry there in 1848.
Pasteur’s initial research focused on crystals and isomerism, but his focus shifted to microbiology when he was appointed dean of the Faculty of Sciences at the University of Lille in 1854.
Pasteur’s exploration into microbiology began with research into fermentation. He observed that fermentation was not a spontaneous chemical process but rather the result of the action of living microorganisms, specifically yeast cells. This insight contradicted the prevailing theory of spontaneous generation, which posited that life could arise spontaneously from non-living matter.
Louis Pasteur in 1857
In 1857 Pasteur returned to the École Normale as Director of Scientific Studies, and by 1859, Pasteur’s experiments definitively disproved the theory of spontaneous generation, paving the way for a new understanding of the role of microorganisms in biological processes.
One of Pasteur’s most significant contributions to modern medicine was the development of the germ theory of disease. Building on his work in fermentation, he expanded his research to investigate the causes of infectious diseases in humans and animals, including a devastating blight that had befallen the silkworms that were the basis for France’s then-important silk industry.
At the time, prevailing beliefs attributed diseases to “miasma” or “bad air,” and there was little understanding of the role of microorganisms in causing illnesses. Through rigorous and meticulous experimentation and observations, Pasteur conclusively demonstrated that microorganisms, such as bacteria and viruses, were responsible for causing various diseases. This groundbreaking revelation fundamentally changed the way diseases were understood and treated, shifting the focus from the symptoms to the underlying causes.
In 1865, Pasteur presented his germ theory to the French Academy of Sciences. His theory revolutionised the understanding of disease causation, laying the groundwork for the development of modern infectious disease control and the importance of sanitation and hygiene in disease prevention.
Louis Pasteur experimenting on bacteria, c1870
Louis Pasteur, one of the founding figures of immunology, was investigating chicken cholera when he discovered attenuation, the process of weakening a strain of bacteria over time. In 1879, a chance observation led him to successfully vaccinate chickens against chicken cholera, sparking a whole new dimension of thought in immunology.
Building upon Edward Jenner’s earlier work with cowpox and smallpox, one of Pasteur’s most celebrated achievements came in 1885 when he successfully developed the first rabies vaccine.
After a boy named Joseph Meister had been bitten by a rabid dog and was brought to Pasteur, Pasteur took a daring approach, vaccinating him with a weakened strain of the rabies virus that had been cultivated in rabbits. To the astonishment of the medical community, the boy survived without contracting the disease.
This groundbreaking success led to widespread recognition of Pasteur’s work, solidifying his reputation as a leading figure in the field of immunology. Pasteur’s rabies vaccine (and his development of the first anthrax vaccine) also became a turning point in preventive medicine. His success in producing weakened strains of these pathogens inspired further research into vaccination, opening up new possibilities for controlling, preventing and eradicating deadly infectious diseases.
Vaccination has since become a cornerstone of public health, preventing countless illnesses and saving millions of lives worldwide.
In the mid-19th century, the spoilage of wine and beer due to microbial contamination was a widespread problem. Pasteur’s earlier laboratory work on fermentation and the ‘diseases’ of wine led him to realise that these were caused by unwanted microorganisms that could be destroyed by heating wine to a temperature between 60° and 100°C.
Aware of people’s natural reluctance to embrace new technology, Pasteur often carried out public demonstrations to convince sceptics. In one such experiment, a batch of heated wine was sent to sea along with some unheated wine on a vessel named Jean Bart. On its return 10 months later, the heated wine was fine, whereas the unheated wine had turned almost acidic.
Pasteur also personally supervised a trial in which the French Navy frigate La Sibylle carried a complete cargo of heated wine on a circumnavigation of the world, arriving back ‘unspoilt’. This helped protect the French wine and beer industries (though pasteurised wine is rare today). Pasteur’s experiments demonstrated that heating liquids, such as wine and milk, to a specific temperature could kill harmful microorganisms and bacteria while preserving the taste, quality and nutritional value of the product.
In 1864, Pasteur presented his findings to the French Academy of Sciences. His simple yet effective process of pasteurisation – which he patented in 1865 and which now bears his name – revolutionised food and beverage preservation, reducing the risk of food-borne illnesses and ensuring a safer food supply.
The process was later extended to all sorts of other spoilable substances, including milk in 1886 by the German chemist Franz von Soxhlet; indeed, pasteurisation was used for most commercial milk by the late 1920s. Today, pasteurisation remains a standard practice in the food industry, ensuring the safety and longevity of various perishable products.
Pasteur’s germ theory also had profound implications for advancing surgical practices.
Prior to Pasteur’s work, surgical procedures were often performed in unsterilised environments, leading to high rates of infections and post-operative complications. Understanding that microorganisms could cause wound infections and transmit disease, Pasteur advocated for antiseptic practices to reduce surgical complications.
British surgeon Joseph Lister was greatly inspired by Pasteur’s work, introducing aseptic techniques into surgical procedures. Lister’s use of antiseptics to sterilise surgical instruments and wound dressings significantly reduced infection rates and improved patient outcomes, transforming the field of surgery and promoting a major shift in surgical practices.
Pasteur’s influence on surgical techniques and infection control has saved innumerable lives and laid the groundwork for modern aseptic surgical procedures.
Portrait of Louis Pasteur, by Albert Edelfelt (1854–1905) (cropped)
Image Credit: Wikimedia Commons / Musée d'Orsay / Public Domain
Despite his lack of a medical degree, Pasteur’s work was internationally recognised, and his influence was far-reaching. He received numerous accolades during his lifetime, including his election to the French Academy of Sciences and being awarded the Copley Medal from the Royal Society of London.
Pasteur himself founded the Pasteur Institute in Paris in 1888, and it became a centre of scientific excellence and research in microbiology and immunology. It remains a prominent research institution, continuing Pasteur’s mission of advancing medical knowledge and finding solutions to public health challenges.
Pasteur is said to have downplayed the assistance of colleagues and, at times, to have been slow or selective in reporting his findings. Even so, his accomplishments have been summed up by many via his famous maxim, “chance favours only the prepared mind”.
Beyond his scientific achievements, Pasteur’s dedication to rigorous scientific method, careful experimentation and intellectual curiosity has inspired generations of scientists to push the boundaries of medical research. His emphasis on evidence-based medicine has become a cornerstone of modern medical practice, ensuring that treatments and interventions are based on solid scientific evidence.
Liston was committed to improving surgical practices with his innovative approaches – he was the first surgeon in Europe to operate on a patient under ether anaesthesia, and was known for an emphasis on patient welfare and education that set him apart from his peers. His contributions continue to influence modern surgical techniques today.
Here we explore the life of Robert Liston, and his legacy on the medical profession.
Robert Liston was born on 28 October 1794, in West Lothian, Scotland. His mother died when he was 6, so he was raised and taught by his father. Liston’s interest in medicine emerged at an early age, and he entered Edinburgh University’s Medical School in 1808 aged 14, studying under some of the most prominent medical practitioners of the time. He showed exceptional talent and dedication to his studies.
In 1810 Liston began his medical training as assistant to famed anatomist, Dr John Barclay. In 1814, he was appointed house surgeon at the Royal Infirmary of Edinburgh, and was admitted to the Royal College of Surgeons in London two years later aged 22.
Liston had already gained a reputation for being argumentative, straight-talking and intimidating, and after a disagreement with Barclay, opened his own anatomy class in 1818. Liston became known as a fearless surgeon, willing to operate on patients other surgeons deemed untreatable. His criticism of surgeons he did not respect and charges that he was enticing patients towards his own practice briefly led to Liston being dismissed by the Infirmary, but he was later reinstated.
Nevertheless, in 1835 Liston was appointed Professor of Surgery at University College Hospital in London, where he remained working for the rest of his life.
In the early 19th century, before the development of anaesthesia, amputations were performed on fully conscious patients who were able to feel the excruciating pain of the surgery. Speed was crucial, and thus surgeons who removed limbs the quickest gained popularity. A skilled surgeon could amputate a leg in under three minutes.
During the procedure, a screaming patient was restrained on a wooden bench by ‘dressers’, who would also assist the surgeon with ligatures, knives and dressings. Although unknown at the time, rapid operations minimised the exposure of tissue to microbes and infection; even so, many patients still died from shock, blood loss, and post-operative infection.
In the early 19th century, most surgeons, including Liston, believed surgery should be a patient’s last resort. When a bone had penetrated the skin, amputation was often the only option; the alternative was gangrene, blood poisoning and death.
Before developments such as blood transfusions, patient survival also depended largely on how quickly the surgeon could manage bleeding.
Liston quickly gained recognition for his surgical skills, precision, speed and efficiency in his operations at University College Hospital in London in the early 1840s. The chances of dying from a Liston amputation were around 1 in 6, significantly better than the average Victorian surgeon.
Liston emphasised the importance of minimising patient pain and reducing the duration of surgical procedures, introducing various techniques to streamline operations, including the use of rubber tubing ligatures to control bleeding, instead of traditional silk ligatures.
Medical students and visiting surgeons would pack Liston’s surgical theatre, eager to witness his techniques. Liston would often nod to his audience saying “Time me gentlemen, time me!”. His above-the-knee amputations from incision to final suture were completed in less than 30 seconds, earning him the nickname, ‘The Fastest Knife in the West End’.
Whilst Liston was celebrated for his speed, it sometimes led to unfortunate accidents. On one occasion, Liston accidentally removed a patient’s testicles along with the leg being amputated due to the swiftness of his work.
Additionally, Liston holds the distinction of being the only surgeon said to have performed an operation with a 300% mortality rate. During a high-speed amputation, he inadvertently severed his assistant’s fingers and slashed a spectator’s coat, causing the shocked spectator to faint. All three later died – the patient and assistant from sepsis, and the spectator from shock.
Scottish surgeon Robert Liston. (Left: photograph circa 1845 by Hill & Adamson)
Although known to clench a knife in his teeth while his hands were occupied, Liston had a strong sense of cleanliness and order in his surgical procedures. He was one of the few surgeons known to wash his hands before operations, long before the formulation of germ theory and antiseptics. He also always wore a clean apron for each operation – uncommon for the time, as most surgeons wore the same filthy apron as evidence of their abilities and experience.
Liston also shaved surgical sites prior to incision and insisted on clean surgical sponges, and cold water-soaked dressings only.
While many believed the pain of surgery acted as a stimulant and was beneficial for healing, Liston was an early advocate of anaesthesia. He performed Europe’s first operation under ether anaesthesia on 21 December 1846. His patient, Frederick Churchill, caused amusement in the operating theatre when he woke after the amputation asking Liston when he was going to begin. However, ether was highly flammable, and its combination with Victorian gas lighting was potentially lethal.
Liston also made significant contributions to surgical techniques, revolutionising amputations by using a long straight knife with sharpened edges, known as the ‘Liston knife’, which became a standard amputation tool. He also invented forceps with a built-in snap to control arterial bleeding.
Liston recognised patients’ fears regarding surgery, and believed in the importance of informing and reassuring them, paying great attention to post-operative care too. He also challenged notions that experience and ability were solely based on age, arguing that volume of cases determined experience, and that surgeons should be prepared to take courageous action.
Set of four Liston-type amputation knives
Image Credit: Wikimedia Commons / Wellcome Images / Photo number: L0057237 / CC BY 4.0
Liston was tough and demanding. Standing at 6 foot 2 inches tall, his height contributed to the speed and physicality of his surgical operations – and his imposing presence. While he could be harsh towards trainees who didn’t meet his standards during operations, he was rumoured to be kind outside of work. Liston strongly advocated for the humane treatment of patients, emphasising the importance of treating them with dignity, obtaining their consent, and displaying empathy.
Liston also firmly opposed practices that he deemed unethical, objectionable and unscientific, and confronted surgeon Robert Knox, believed to have been involved in the infamous William Burke and William Hare serial murders and grave robberies from 1827-1828 that supplied bodies for dissection by anatomy students.
Robert Liston died in 1847 aged 53 of a ruptured aortic aneurysm. His funeral was attended by over 500 students, friends and pupils, and he was buried at Highgate Cemetery.
Whilst a highly respected surgeon, Liston was not immune to criticism. Some questioned his focus on speed, suggesting it compromised patient safety and thoroughness, and his brash and confrontational personality sometimes led to strained relationships with colleagues.
However, Robert Liston’s contributions to the medical profession were significant. His use of flaps in amputations had a substantive impact on surgical technique, as did his innovations in commonplace instruments such as his amputation knife, locking vascular forceps, and long splint. He also incorporated hygienic practices that foreshadowed the concept of antiseptic, sterile operating environments.
Liston understood patient fears about surgery, and was a dedicated educator who imparted his knowledge and techniques to his students, holding them to high standards while fostering an environment of continual improvement. His approach continues to influence modern surgical practice today.