The prostitute whose pox inspired feminists

Fitzrovia, 1875. A woman recorded only as A.G. enters hospital and is diagnosed with syphilis.


A 24-year-old woman suffering from syphilis.

On 3 November 1875 a 19-year-old girl recorded only as A.G. was admitted to the Central London Sick Asylum in what is now Fitzrovia. As A.G. lay in bed, limbs heavy and eyelids drooping, a throbbing pain seeped down her back. Her entire body was covered with small rose-coloured spots, physical signs that marked her out as both a sufferer of syphilis and a disreputable young woman.

A.G.’s medical notes identify her as a prostitute. About eight weeks before, she may have noticed a small pimple appear. When it grew to the size of a pea, rupturing to form an ulcer, it was the first sign she had contracted a condition the medical establishment claimed to be generated within the bodies of women. Official efforts to contain the disease in the 1800s focused on controlling women – especially women like A.G. These tactics were to have unintended consequences, as they sparked the emergence of the first wave of feminism.

Syphilis was first associated with prostitution and supposed depraved behaviour soon after it appeared in Europe at the end of the 1400s. Originally the disease was known as ‘the Pox’; the name syphilis derives from the title character of a Latin poem of 1530, in which a shepherd is punished by a god for his defiance. When it became apparent that the disease could be transmitted through sexual contact, it was interpreted as divine punishment for promiscuity.

When A.G. started working as a prostitute, ‘fallen women’ were thought to have a high risk of contracting syphilis not – as might be expected – due to their increased chance of being exposed to infection, but because of their inherent immorality. If the disease was a direct result of promiscuous intercourse, prostitutes were nothing less than a festering sore on society. Like plague-infected rats or cholera-swamped sewers, women who made their living selling sex were a problem that had to be monitored and improved.

“[The prostitute] is a woman with half the woman gone, and that half containing all that elevates her nature, leaving her a mere instrument of impurity… a social pest, carrying contamination and foulness to every quarter to which she has access.”
William Acton, Prostitution, Considered in its Moral, Social, and Sanitary Aspects, 1857


Syphilis depicted as a skeleton masquerading as a prostitute, 1830

The Contagious Diseases Acts, which were first implemented in England and Ireland a decade before, provided a legal framework for keeping track of prostitutes and isolating those who were infected. Despite the generic title, the Acts were designed specifically to reduce the impact of syphilis and gonorrhoea on men serving in the military. Their reach was limited to a small number of ports or garrison towns, where plain-clothes police officers were empowered to stop any woman they had “good cause” to think might be a prostitute.

These women were then requested to submit to an internal medical examination. The inspection was described as “voluntary”, but police were known to coerce illiterate and underage women into agreeing to it. Later versions of the law made fortnightly inspection compulsory once a woman’s name had been added to an official register. A woman thus marked out as apart from – or even as something less than – ordinary women could be jailed if she refused to comply.


A 19th-century drawing of a woman being inspected with a speculum.

“It must be acknowledged, in fact, that by this means alone can we hope to reduce the ever-growing number of cases of syphilis in women whose vice and poverty has set them outside society.”
J-B Venot, Aperçu de statistique médicale et administrative, 1837

Initially, examinations were undertaken using a cold, metal speculum, a device that enables a doctor to open up a woman’s vagina and look inside. The leading syphilis specialist of the era considered the speculum to be an indispensable “instrument of medical control”; prostitutes in France called the device “the government’s penis”, while British campaigners against the Contagious Diseases Acts used the term “instrument of rape”.

“…often they use several. They seem to tear the passage open first with their hands, and examine us, and then they thrust in instruments, and they pull them out and push them in, and they turn and twist them about.”
A woman registered as a prostitute, quoted in The Forcible Introspection, 1870

Gallery: speculums

If found to be infected with syphilis, women were detained in isolation wards or specialist hospitals, effectively quarantining them from the men they were thought to be polluting. From the 1860s onwards, some of these ‘lock hospitals’ also attempted to treat inmates’ moral lapses, hoping to prevent their return to prostitution after release.

A central principle of the Contagious Diseases Acts was the belief that syphilis arose in the bodies of women, especially those of immoral character. Syphilis and other venereal diseases were for many years personified as women who, out to tempt their male victims, should be avoided.

Gallery: twentieth-century depictions of syphilis and other venereal diseases

Yet anyone could see that prostitutes – or even ordinary women – weren’t the only carriers of syphilis. On the same day A.G. arrived at hospital, an educated, married woman in her late 30s was also admitted. Identified by the initials A.P., she struggled to hold a cup of tea, spoke in a “deranged” fashion and endured a persistent ache in her head. Seven years previously she had developed a large coin-shaped sore on her tongue after catching syphilis from her husband.


An out-patient at London’s Royal Free Hospital, 1874.

Women like A.P., who were likely celibate before marriage, could hardly be accused of being a source of disease or of indulging in immoral acts. Yet the possibility that men could act as carriers was largely overlooked. The soldiers that the Contagious Diseases Acts were intended to protect were not monitored in the same way as local women; well-to-do men were not considered a threat, since even the most dissolute philanderer was not expected to infect as many people as a diseased prostitute. However, these established views were questioned when medicine turned its attention towards the issue of congenital, or hereditary, syphilis.

Syphilitic pregnant women had a high chance of miscarriage, and A.P. lost several babies after she became infected. Even when a pregnancy reached full term, many ‘innocent’ victims of syphilis died days after birth. Those who survived lived with the stigma of physical deformities, which suggested they might be the product of an immoral liaison. Some also had mental abnormalities that hindered their development. Since some syphilitic babies were born to mothers with no visible symptoms, doctors in the 1800s started to consider whether men could be responsible for transmitting the disease to their children.

Gallery: congenital or hereditary syphilis

The situations of women like A.G. and A.P. led early feminist campaigners to focus on this issue of male responsibility. In 1869 social reformer Harriet Martineau launched the Ladies National Association with the support of Florence Nightingale. The organisation campaigned for the repeal of the Contagious Diseases Acts, which its charismatic secretary Josephine Butler said contravened the constitutional rights of women.

“It is unjust to punish the sex who are the victims of a vice, and leave unpunished the sex who are the main cause, both of the vice and its dreaded consequences.”
The Ladies’ Appeal and Protest, 1870

The Ladies National Association directly challenged mainstream medicine and its subordination of women. Contrary to published medical statistics, it claimed the Acts had failed to control the spread of syphilis and it viewed prostitution not as a public health problem but as the logical result of a society that legitimised men’s sexual privilege.

“It is coming to be more and more a deadly fight on the part of us women for our bodies. If these doctors could be forced to keep their hateful hands off us, there would be an end to laws which protect vice, and to many other evils.”
Josephine Butler, writing in 1872

At the time, the fact that women – and middle-class Christian women at that – spoke publicly on sexual issues caused a sensation. Yet, while they fought for the rights of women like A.G., these early feminists still portrayed women we would now describe as sex workers as unfortunate victims, outcasts that should be “kept apart” from pure, respectable ladies. The campaigners’ ultimate aim was, in reality, two-fold: to nullify a law that applied “to one sex only” and to eradicate the “soul-devouring evil” of prostitution.


Illustration from a French book describing prostitution, 1884.

“The cause of sexual disease is the subjection of women. Therefore to destroy the one we must destroy the other.”
Christabel Pankhurst, ‘A woman’s question’, published in The Suffragette, 1913

The Contagious Diseases Acts were finally repealed ten years after A.G. arrived at the Central London Sick Asylum, but the question of the true cause of syphilis and the fight for women’s rights still raged on almost four decades later. One of the leaders of the suffragette movement, Christabel Pankhurst, believed the “great evil” of sexual disease could only be addressed if women gained greater independence and men observed the same moral standards as virtuous women. Her stance was summed up in the campaign slogan “Votes for Women and Chastity for Men”.

Neither A.G. nor A.P. would live to see the repeal of the Contagious Diseases Acts or gain the right to vote. A.G. died just a week after admission. Her post mortem showed that parts of the membrane around her brain had turned to jelly. Within a few weeks her fellow patient lost the ability to speak. Three months later she died of the same cause as A.G., her education and respectability notwithstanding.

The child whose town rejected vaccines

Gloucester, 1896. Ethel Cromwell is taken ill at the height of Britain’s last great smallpox epidemic.


Ethel Cromwell in hospital, 1896.

In the spring of 1896 Ethel Cromwell lay, covered in infectious blisters, in a Gloucester hospital. The photographer who took her picture recorded Cromwell’s name, her age (“about 14 years”), her date of admission and one other crucial fact: that she had never been vaccinated against smallpox, the disease now causing her so much pain.

Cromwell was one of at least 10,000 local children thought to be unvaccinated when Britain’s last great smallpox epidemic hit the West Country city of Gloucester. Each of their parents or guardians had acted outside the law, risking the health of their children and that of the wider population.

In mid-January, a few days before Cromwell arrived at the hospital, the fever, headaches and nausea would have started. She may have had a searing pain in her back and her throat was probably raw with sores. As soon as a rash of flat red spots appeared, smallpox would have been diagnosed and Cromwell would have been sent to hospital. There her rash would have developed into blisters filled with yellow pus, which would have stretched her skin so tight it ached. Back at home, Cromwell’s clothes and bedding would have been burnt, her rooms disinfected and whoever she lived with quarantined.

As the first spots broke out on Cromwell’s face, body and arms, Dr John Campbell, the Gloucester Medical Officer of Health, realised the outbreak, which had begun the previous summer, was now progressing “in an alarming manner”. In order to contain it, Campbell advised the local sanitary authority to recruit a team of people to travel from house to house providing vaccinations.

His recommendation wasn’t implemented at that stage. Instead, the formerly indifferent Gloucester Board of Guardians – who oversaw relief given to the poor – issued an announcement recommending parents obtain immediate vaccinations for their children. Around the same time, lecturers indulging in what Campbell termed “intentional untruthfulness” travelled the country claiming the epidemic was caused by insanitary conditions. These speakers formed part of a mass political movement that opposed vaccination, in particular the state’s decision, 40 years before, to make it compulsory.

“The anti-vaccinators have not been idle, and have caused it to be rumoured far and near that the true cause of the epidemic was the unhealthy condition of the city, and not the want of vaccination. These, I find, are the tactics they adopt everywhere, a red herring, I suppose, to divert the scent from the true cause.”
John Campbell, Medical Officer of Health for Gloucester City and Port, 1897

Prior to the 1800s the only way to control the spread of smallpox was through a practice known as variolation. First used in China, this early form of inoculation took fluid or scabs from someone infected with smallpox and introduced them into a healthy person, either by inhaling the material or by rubbing it into a cut on the arm or leg. If all went well, the recipient caught a mild version of the disease and therefore became immune to later infection. However, some people died from the procedure or from other infections picked up during it. Anyone deliberately infected with the disease also had the obvious potential to spread smallpox to others.


Illustration of pustules used for inoculation in China, 1913.

In 1796, just 30 kilometres from Gloucester, Dr Edward Jenner experimented with a safer version of variolation based on the observation that dairymaids who had suffered from the much milder cowpox appeared to be protected from smallpox. Jenner transferred material from a woman infected with cowpox into an eight-year-old boy, a process he called ‘vaccination’ after the Latin word for cow: vacca. Jenner wasn’t the first person to successfully vaccinate using cowpox, but his work conferred scientific status on the procedure and led to its widespread adoption.

By 1853 smallpox vaccination was a legal requirement for newborns in England and Wales, yet there were few consequences for anyone who avoided it. It was 20 years before the first prosecutions were brought, when anyone found guilty could be penalised by a fine or prison term. This move transformed an issue of personal medical preference into a question of civil liberties. It also challenged the long-held rights of individual councils and communities to make decisions based on the specific circumstances of their own areas. In response, the world’s first organised anti-vaccination societies, publications and rallies sprang up.

“The State has no right to encroach upon parental responsibility, or to impose either religious or medical dogmas upon the people of this country on any pretence whatever.”
William Tebb, President of the London Society for the Abolition of Compulsory Vaccination, 1887

There is no record of whether anti-vaccinators directly influenced Cromwell’s parents or guardians. Their decision couldn’t have been due to cost: vaccination had been freely available at public vaccination stations since the early 1800s. Perhaps they didn’t see the need for it or didn’t want to put their child through what was, by today’s standards, still an unpleasant and potentially dangerous procedure. Or perhaps they resisted on moral or religious grounds. For whatever reasons, the prevailing mood in Gloucester had turned against vaccination at least a decade before.

Local MPs and officials on the Board of Guardians were elected on anti-vaccination tickets and, in 1887, the board voted to “take no further steps in vaccination prosecutions”, effectively removing any compulsion to agree to it. As a result, the year before the 1896 epidemic the Vaccination Inquirer described Gloucester as the least vaccinated city in the country, with 83 per cent of the population failing to comply with the law.

Individual reasons for objecting to vaccination were, as they still are today, diverse. Some felt any technological intervention against a common disease was unnatural. Smallpox, after all, had been a fact of life for centuries. It struck all sectors of society, even killing Queen Mary II, her brother, uncles and nieces. For many, attempting to control something so embedded in the experience of life (and death) was an act against God’s divine plan.

Other non-vaccinators felt the process went against the laws of logic – how could introducing infectious material into the body ever be good for you? Even people who embraced the latest scientific thinking – Charles Darwin’s theory of evolution by natural selection – felt vaccination was flying in the face of nature, since it used material taken from a lower form of life (the cow).

This issue of intermingling the bodily material of a beast and a human prompted some of the most emotional reactions against vaccination, as people feared the process might cause them or their children to develop brutish, cow-like appendages or habits.

Gallery: anti-vaccination illustrations

Whatever the reasons for so many people in Gloucester defying the law, their decisions enabled the disease to spread quickly through unvaccinated children at school. Fifty other cases were notified during the month Cromwell arrived in hospital; four weeks later there were no free hospital beds. By the end of April the number of monthly diagnoses had risen to almost 900.

As the severity of the outbreak became apparent, many parents took their children to free vaccination stations. The Board of Guardians, who now viewed vaccination as a way of controlling the epidemic, finally formed a committee to oversee house-to-house vaccination visits and instructed employers to get their staff – many of whom may have received vaccinations as babies – re-vaccinated.

Posters and handbills distributed to households the day before, and on the morning of, the vaccinators’ visits appealed to individuals’ moral duty, asking anyone who had previously opposed vaccination to consider “the grave responsibility” they were incurring and imploring them to “follow the good example already set by so many… who have submitted both themselves and their families to the operation for the public good”. Unlike pro-vaccination campaigns in later years, the wording emphasised the benefit to the wider public rather than the impact on individual health.

Gallery: pro-vaccination campaigns

Six months after Ethel Cromwell contracted the disease, no new cases were being reported in Gloucester. The mass vaccination project had brought the outbreak under control, but not before 1,981 people had been infected. Two-thirds of these were children under ten. The handful of these who had been vaccinated all survived. Of the unvaccinated, 40 per cent died. It was a similar story in other segments of the population.


Percentage of smallpox cases in the Gloucester epidemic that resulted in death.

Faced with a severe outbreak, Gloucester residents embraced vaccination in something akin to a religious conversion, but the level of support for compulsory vaccination at birth hardly changed. Just two years later a new law allowed parents to opt out of vaccination on grounds of conscience. Conscientious objectors had, initially, to convince a magistrate that they believed the vaccine to be unsafe or ineffective. Of course, one’s conscience, as the National Anti-Vaccination League had pointed out years before, is not something that can be assessed by anyone but the individual. It cannot be evaluated in any judicial – or scientific – way.


Ethel Cromwell convalescing, 1896.

By April Ethel Cromwell’s blisters had dried up, her infectious scabs dropped away and she was discharged from hospital. Though lucky not to have died or lost her sight, she would bear smallpox’s characteristic pockmarks for the rest of her life. When Gloucester experienced another epidemic 30 years later, she and other survivors of the 1896 outbreak would, at least, have been immune – a position they could have been in much earlier if their parents had consented to vaccination.

The colonist who faced the blue terror

India, 1857. In a British enclave, Katherine Bartrum watches her friend, and then her family, succumb to the deadly cholera.


A girl suffering from cholera.

At 3pm on 29 June 1857, 23-year-old Katherine Bartrum, an Englishwoman living within the fortified walls of a British complex in the north Indian city of Lucknow, watched as her friend was taken ill with cholera. It was the disease most feared by British residents in India, but also by their compatriots back home, both for its rapid and horrific onset and for what it came to symbolise.

“There are few diseases which have excited more interest among medical men, or more terror in the mind of the Indian community at large, than the epidemic cholera.”
James Annesley (of the Madras Medical Establishment), Sketches of the Most Prevalent Diseases of India, 1825

During Bartrum’s bedside vigil her friend would have experienced severe diarrhoea, losing litres of fluid. Vomiting and writhing in pain, she would have had an unquenchable thirst, and her eyes and cheeks may have sunk into her face. Most startlingly, her lips, fingernails and skin would probably have turned an eerie shade of blue.

Within three hours the cold and clammy “dews of death” gathered on the patient’s brow and she lost consciousness. By 8pm, as her now motherless child slept unawares, Bartrum’s friend was already in her coffin. Surprisingly for the wife of a medical officer, this swift demise was Bartrum’s first experience of “death in any shape”. Yet, just a month before, cholera had claimed the life of the Commander-in-Chief of British India. Bartrum was also destined to encounter the disease again in the coming weeks.


A 23-year old woman before and after contracting cholera.

Cholera had been known in India for hundreds if not thousands of years, but for centuries it was limited to the Bengal region in the east. The “blue terror” travelled across India – and beyond – as the British expanded their grip on a country that had been under the control of the British East India Company for a century.

As the leading cause of death among British troops in India, cholera earned itself a reputation as an insidious, violent enemy always ready to attack. The British viewed Indians – and their “very loose habits” – as the natural cause of the disease, but the British themselves acted as carriers. Their large-scale troop movements aided cholera’s emergence from Bengal, British soldiers fighting on India’s northern borders introduced the disease to their Afghan and Nepalese opponents, and British troops carried it to the Persian Gulf when they were deployed to Oman.


Nineteenth-century map showing routes of cholera from India to Europe and North America.

Even civil interventions by the colonial power contributed to cholera’s spread. By the time Bartrum arrived in Lucknow, the country’s first railway and the world’s largest canal – a network of routes spanning over 1,000 km – had opened. Both aided cholera’s expansion across the country.

Many Indians blamed the British for cholera’s spread, albeit for different reasons. Some believed cholera was meted out as divine retribution when the British defiled holy places or slaughtered cows, which are considered sacred in the Hindu religion. Others felt the disease was caused by deities who resented British rule. Since Indians were just as likely to catch cholera as the colonists, this meant the wrath of these gods was also targeted at Indians, who had failed to stand up to the British.

Cholera reached the heart of the British Empire too. When the first of four major cholera epidemics hit Britain in 1831, killing around 30,000 people, this ‘new’ disease sparked increased debate and a frenzy of analysis. For many years opinion was divided between those who believed cholera was spread through contact and those who blamed bad air and/or the effects of soil temperature. In the ten years before Bartrum arrived in Lucknow, efforts to understand the disease led to the publication of over 700 cholera-related books in London alone.

Gallery: studies of cholera outbreaks and causal factors

These studies served a purpose as epidemiological tools, but they also gave credence to politicised social policies. As the science of epidemiology developed, medicine shifted away from analysing the behaviour of individuals to investigating issues related to entire populations. These ranged from the nature of the water supply in specific parts of a city to the characteristics thought to be shared by a particular race. In the eyes of 19th-century Brits, the people of India – who were once viewed as fastidiously clean – were thought to be disorderly and dirty.

“The habits of the natives are such that, unless they are closely watched, they cover the entire neighbouring surface with filth.”
Royal Commission on the Health of the Anglo-Indian Army, 1863

Bartrum’s uninviting house in Lucknow was certainly dirty, but filth was also a fact of life in England. However, the 1848 Public Health Act – prompted by Edwin Chadwick’s report on the sanitary conditions of the labouring classes – now set England apart from India. While the home country was taking steps to bring filth and disease under control, India was viewed as stagnant and lacking self-discipline, like the immature child of the great British parent. In this climate, cholera came to symbolise the aspects of Indian society most feared by Europeans.

“One is no less saddened to see the populace as cruelly decimated by this horrible scourge in Berlin, London, and Paris, which stand at the head of modern civilization, as in the backward nations of the Orient and Northern Europe.”
Gazette médicale de Paris, 1832

Fear was the reason Bartrum herself had come to Lucknow. She had arrived seven weeks before, leaving her husband at another military station after Indian soldiers mutinied and killed civilian Europeans living in the city of Delhi. This event marked the start of India’s First War of Independence, and soon led to a six-month siege of the city where Bartrum had taken refuge. From this point on, to British eyes, India was increasingly a place of barbarism.


John Bull defending Britain against the invasion of cholera, 1832.



Poster warning of the “alarming approach” of what was described as “Indian” cholera, produced in London in 1831.

In London, Dr John Snow had argued in 1849 that cholera was caused by swallowing poisonous matter that was transmitted through faeces and contaminated water. However, his views did not gain acceptance until at least a decade later. In the meantime, the British medical establishment maintained the stance that Indians were somehow fundamentally different to Europeans. Though scientific investigations found little evidence that race played any role, Indians were inextricably linked with the cholera they were thought to produce.

“Their ways of living are not ours, and for hygienic reasons… close proximity is not desirable.”
Kate Platt, The Home and Health in India and the Tropical Colonies, 1923


Nineteenth-century caricature revealing the microscopic impurities found in London’s drinking water.

Of course, if India and Indians were viewed as irredeemably unsanitary, the British administration could excuse itself from spending time and money trying to improve conditions. Susceptible areas in England were seen as unhealthy and vulnerable until improved; India, on the other hand, was beyond hope. Medical theories – despite the evidence – supported the differing political moods at home and abroad.

In England cholera was an alien invader, a colonist in its own right, occupying both the body and the land. As epidemic followed epidemic, people feared the disease might eventually ‘settle’, taking over the country. At the same time, the British administration in India prioritised the health and comfort of its own troops above all else. The Indians now fighting to eject the British from their homeland had to live in far worse conditions.

Just as these Indian rebels laid siege to Lucknow, Katherine Bartrum’s 17-month-old son Bobbie contracted cholera. Though the doctor told Bartrum her son was dying, she administered “the strongest remedies that could be given to a child” and knelt by his bed all night. By morning the outlook was better: Bobbie “began to revive, sat up, and looked so bright”.

Despite being struck down herself the following day, and discovering two months later that her husband had been killed in action, Bartrum and her son managed to survive until the British withdrew from Lucknow four months later. The pair then travelled to Calcutta and boarded a ship bound for England. The night before it set sail, Bobbie, who had been growing weaker by the day, died.


Poster advising residents of east London not to drink unboiled water during the 1866 cholera epidemic.

When Bartrum arrived back in England, London was in the midst of the ‘Great Stink’, a summer in which the stench of excrement from the Thames became so intolerable that politicians launched a project to develop a citywide sewer system. England experienced its last cholera outbreak eight years later. In London it was localised to an area not yet connected to the new sewage network. But in India millions of people died in later outbreaks. Today cholera remains, as it was before the 1800s, endemic in some areas of the country.

The cook who became a pariah

New York, 1907. Mary Mallon spreads infection, unaware that her name will one day become synonymous with typhoid.


A typhoid patient, 1882.

In March 1907 a cook named Mary Mallon was visited by a sanitary engineer at her place of work, a swanky townhouse on New York’s Park Avenue. The unexpected caller told Mallon he suspected her of making people sick and requested samples of her urine, faeces and blood. This encounter marked the first time any healthy person in America had been accused of transmitting typhoid fever, a disease responsible for the deaths of 13,000 people in the USA the year before.

Mallon almost certainly thought the engineer’s claim preposterous. It was true that two members of the Park Avenue household in whose kitchen she worked had recently contracted typhoid: a chambermaid, followed, fatally, by the daughter of the homeowner. But Department of Health officials had already blamed the outbreak on the public water supply, and Mallon, who took pride in her work, was surely too clean to be a threat?

Besides, typhoid was everywhere, and Mallon had never had the disease herself, so how could she possibly spread it? Angry at the intrusion into her workplace and the smear against her character, Mallon seized a carving fork and chased her accuser out onto the street.

Mallon’s reaction was the complete opposite of what the sanitary engineer, George Soper, had expected. He later made sense of the cook’s “indignant” and “stubborn” behaviour by classifying her as peculiar and “perverse”, adding descriptions of her walking and thinking “more like a man than a woman” to his otherwise scientific reports. This idea of Mallon as unfeminine, deviant and wayward perhaps made it easier to justify her later treatment, or perhaps even influenced how Soper and others approached her in the first place.

Though the two had not met prior to their Park Avenue showdown, Mallon was identified by Soper as the guilty party in a mystery he had been unravelling for months. Asked the year before to investigate an unexplained typhoid epidemic at the summer house of a New York banker, Soper had found his killer clue in the discovery that a cook had started work in the house three weeks before the outbreak and had moved on three weeks afterwards. He tracked down this “Irish woman about 40 years of age, tall, heavy, single” and in “perfect health” through the employment bureau that had placed her in the role.

The next time the pair saw one another, Mallon came home to find Soper waiting for her on the stairs outside her lodgings. She again refused to provide the samples Soper demanded. He responded by recommending that she be taken into custody by the New York City Department of Health. To support his case, Soper shared information about other typhoid outbreaks, dating back seven years, which had all occurred in homes where Mallon had worked. He added that her excrement should be “made the subject of careful bacteriological examination”.

“I called Mary a living human culture tube and chronic typhoid germ producer. I said she was a proved menace to society.”
George Soper, The Curious Career of Typhoid Mary, 1939

Typhoid bacteria from one of the earliest surviving films of bacteriological research, 1910s

Less than a week later, Mallon turned away another visitor requesting samples of her bodily fluids. The following day this new pursuer, Dr Josephine Baker, returned with reinforcements. Three policemen surrounded the Park Avenue house. Another joined Baker on the doorstep and a horse-drawn ambulance parked nearby, ready to take Mallon away. The as-yet-unproven “germ producer” promptly vanished, taking refuge in the outside toilet of a neighbouring house, while someone else piled a dozen ashcans outside its door.

Several hours later the toilet door was finally pried open and Mallon, fighting and cursing, was forced into the ambulance. On the way to hospital Mallon was said to be so “maniacal” that Baker had to sit on her. When they arrived, Mallon’s faeces were collected and analysed. The results confirmed what Soper’s epidemiological investigations had predicted: Mallon was carrying “a pure culture of typhoid”. Though no one quite understood how this could be the case, given Mallon’s good health, it’s likely that she had experienced a mild bout of the disease some years earlier, without even noticing.

In 1907 the concept of healthy typhoid carriers was just beginning to generate scientific interest. A year before, a woman who had recovered from typhoid ten years previously had been identified as a carrier in Strasbourg, Germany. Scientists soon realised that the bacteria responsible for typhoid survived in a small percentage of people’s bodies long after any ill effects had passed. Why this happened, no one really knew.

L0033877 Health & Disease in Deadly Combat

White blood cells attacking typhoid in the bloodstream, 1912.

Within a fortnight of Mallon’s capture, the New York press published her story, without revealing her name. Mallon was labelled a “human typhoid germ”, a “danger to the community” and a “walking typhoid fever factory”, further dehumanising a woman who had been arrested, incarcerated and medically examined without charge or trial. Mallon was soon confined on an island less than a kilometre square, where she was examined several times a week. She spent most of the rest of her life there.

In 1909 Mallon hoped to gain her freedom from North Brother Island at a hearing in the Supreme Court, an event that also allowed the press to reveal her name. They responded by inventing a new title, still used to describe a pariah or, according to the Oxford English Dictionary, “a transmitter of undesirable opinions or attitudes”: Typhoid Mary. The newspaper The American, while sympathetic to Mallon’s plight, ran a feature under this title, illustrating it with a drawing of a cook tossing human skulls into a frying pan.

TO-03-06-Article

Mary Mallon is publicly named – and renamed – in The American, June 1909.

“My name is Mary Mallon. I was christened and baptised Mary Mallon. I lived a decent, upright life under the name of Mary Mallon until I was seized. (Then I was) locked up in a pest-house and rechristened ‘Typhoid Mary’, the name by which the world has ever since known me.”
Mary Mallon

Mallon’s case was unsuccessful in the face of laboratory cultures from her own stools, which repeatedly showed the bacteria that caused typhoid. The health department presented this as incontrovertible evidence that Mallon was a danger to society. Her own lawyer argued that Mallon’s constitutional right to due process had been violated. In essence, the two sides debated a thorny issue that remains relevant today: how do we protect the wider public’s health without infringing on individuals’ civil liberties?

After the appointment of a new health commissioner in 1910, Mallon was eventually freed on the condition that she no longer work as a cook. The new commissioner even helped her secure a job in a laundry. For the next two years Mallon seems to have avoided working with food. However, she may have found it difficult to earn enough income to support herself, or perhaps she simply wanted to return to her profession of choice.

In 1914, after having fallen off the Department of Health’s radar, Mallon used an assumed name to take up a job as a cook at a New York maternity hospital. It’s unlikely she deliberately set out to cause harm to those who ate her food. Instead, Mallon almost certainly didn’t believe the science she had been presented with and knew she would be prevented from working as a cook if she used her own name.

When 25 hospital staff contracted typhoid in 1915, and two died, the outbreak was again traced back to Mallon, who was returned to North Brother Island. There she remained, quarantined, until her death. In the intervening period, other healthy carriers of typhoid had been identified in the city, including two men who worked in the food business. These carriers were treated in a manner starkly different from Mallon’s.

L0032926 A sprite (?) representing the disinfectant

A representation of typhoid infection from a French advertisement for disinfectant paper, 1890.

One of them, Belgian-born Alphonse Cotils, owned a New York bakery, in which he continued to work despite officially being forbidden to do so. When taken to court in 1924, Cotils received a suspended sentence. The judge acknowledged the extreme danger Cotils posed, but stated that he could not legally jail him “on account of his health”.

Did Cotils escape confinement due to his gender, his nationality or his successful business? Perhaps he wasn’t considered as much of a danger as Mallon, who had a reputation for being pathologically angry and acting irrationally. Official documents of the time show that Mallon was often described as an “Irish woman”, while other carriers were not usually identified by their race or gender. Comments about her personal appearance or demeanour were also common, while absent in the descriptions of others.

“Why should I be banished like a leper and compelled to live in solitary confinement with only a dog for a companion?”
Mary Mallon

Mallon died on North Brother Island in 1938, 31 years after she was first taken there. Her death was reported by the Lancet medical journal, which described the woman who made her living as a cook as “a chronic typhoid carrier”. The man responsible for reporting her to the Department of Health concluded after her death that Mallon had a “curious career” as “the most famous typhoid carrier who ever lived”. It was, of course, not a career she ever chose to pursue.

The tradesman who confronted the pestilence

The City of London, 1665. As the Great Plague hits the capital, John New faces a deadly dilemma.

V0010611 Victims of the plague in 1665 being lifted on to death carts

Victims of the 1665 plague are lifted on to a death cart, Samuel Wale, 1747.

On 6 August 1665, John New, a weaver in the bustling London parish of Cripplegate, watched his son and daughter die from what local broadsheets were already calling “The Great Plague”. The latest outbreak of this fearful disease, and the efforts taken to control its spread through one of Europe’s largest commercial centres, struck weavers like New, as well as other artisans and tradespeople, particularly hard.

Hours or days before, New would have spotted painful black swellings, known as buboes, around the groins, armpits or necks of his children. Such was the fear of plague spreading through the city that symptoms like this had to be reported to the authorities within just two hours. The children would have soon experienced headaches, vomiting and pain so intense that many victims were overcome with frenzy. Their swellings would have turned red, purple or black and may have ruptured to form open sores. The only mercy for New and his family was that death – a near certainty – came quickly.

Despite published precautions and remedies, the only sure way to avoid this fate – which appeared to befall rich and poor alike – was to leave the city. New would have had plenty of time to consider this course of action. It’s possible he would have heard about the spreading “contagion” two months before, when the first weaver in Cripplegate died from the disease and the city’s playhouses closed. A month later the Lord Mayor issued orders designed to control the infection and, by the middle of June, New couldn’t have failed to notice that Londoners were fleeing the city in droves.

“So homewards and to the Cross Keys at Cripplegate, where I find all the towne almost going out of towne, the coaches and waggons being all full of people going into the country.”
Samuel Pepys in his diary entry of 21 June 1665

People fleeing the plague, first printed 1630.

Even after a hundred local people involved in his trade had died, and the Bishop of London had announced that “many thousands of poore Artisans” were on the brink of starvation due to lack of business, New remained in the pestilential city. In fact, though diarist Samuel Pepys claimed “all the towne” was leaving, tradespeople like New, and the city’s poorest inhabitants, had little option but to stay.

Gallery: scenes from the Great Plague

The well-to-do, like the poet John Dryden – who retired to his father-in-law’s estate in the West Country – and many of London’s physicians, had the means to pay for transport out of the sick city. They left their servants to watch over their expensive properties and were easily able to support themselves for an extended stay away from London.

New would have struggled to find the five-shilling fare for the cheapest waggon out of town. He also would have risked losing his livelihood if he left, since his home (which may have doubled as his place of work) might be ransacked in his absence, resulting in the loss of both goods and tools. Leaving the city would have removed any chance of keeping his business alive. However, staying wouldn’t do much for trade, due to controls placed on the movement of goods and people both before and during the outbreak.

Two years earlier the Privy Council – a small group of men who governed England under King Charles II – had taken action when the plague arrived in Amsterdam. The Royal Navy had intercepted ships in the Thames estuary and asked any vessels from Amsterdam to either turn back or anchor at the remote Canvey Island. There, cargoes were unloaded and aired for 30 days. Crew and passengers had to stay aboard for the same amount of time.

In the early summer of 1664, as deaths in Amsterdam rose, ships from all Dutch ports were held outside London for 40 days, a period known as ‘quarantine’, from the Italian word for 40. Though measures to separate the sick from the healthy have been used to limit – with varying levels of impact – the spread of infectious disease in many countries and periods, London’s approach to plague control was based on a concept first developed in Italy.

Gallery: quarantine procedures and structures

Growing out of the thinking of Florentine scholars and lawyers in the 1400s, the ‘ideology of order’ viewed society as a living organism that needs hierarchy and stability to survive. City authorities could therefore justify imposing draconian measures, which disrupted the lives of ordinary people, during periods of crisis such as epidemics.

The Thames shipping controls appeared to work – at first. During 1664 over 24,000 people died from plague in Amsterdam, while only five cases were reported in London. When the disease finally hit the English city, the Lord Mayor’s plague orders meant anyone wanting to leave required a health certificate. Those with the money to bribe officials would find it easier to obtain one than tradespeople like New.

L0046086 Orders conceived and published

Plague orders published by the Lord Mayor and Aldermen of the City of London, 1665.

The orders required householders to clean the street outside their home every day and insisted that burials only take place at night – with neither family nor friends present. No dogs, cats, rabbits or pigeons were allowed in the city, and businesses that involved travel or gatherings of people had to stop operating. Ale- and coffee-houses closed down, as did markets, theatres and other social spaces. Street sellers and hawkers were prohibited from selling any goods and all trade in second-hand clothes was banned.

Across the city, official examiners and searchers were appointed “to inquire and learn” which homes had been “visited” by illness and to identify those affected by plague. These houses were then shut up for 28 days, complete with all their inhabitants – whether infected or not. Plague homes were marked by a foot-high red cross on their doors and the words “Lord have mercy on us”. Watchmen supplied the inhabitants with food, while ensuring none of them left the property. Nurses could be provided at the cost of the parish.

“A band of Halberts mustred were, to guard –
The People from the Plague in evry Ward
And if they found by making inquisition
(Or had but any probable suspition)
Where lodgd it had (although but for a night)
That man was baisht from the publick sight,
Imprisned in his house both night and day,
As one that meant the Citie to betray.”
George Wither in his epic poem about an earlier plague outbreak, The History of the Pestilence, 1625

The practice of shutting up the healthy alongside the sick was unique to England. Local writer Daniel Defoe, aged just six at the time, later described “the confining the Sound in the same House with the Sick” as “a great Subject of Discontent”. Unsurprisingly the practice prompted many complaints. Shutting up was variously thought to cause more disease, bring on a melancholy that made one more susceptible to it, obstruct trade and add to the demands on parish budgets, which now had to support both the well and the sick.

“This shutting up would breed a Plague if there were none: Infection may have killed its thousands, but shutting up hath killed its ten thousands.”
Anonymous, The Shutting Up Infected Houses as it is Practised in England Soberly Debated, 1665

What officials attempting to stop the spread of plague didn’t know at the time was that it is very rarely transmitted from human to human. Plague is caused by a bacterium that usually infects small rodents, including black rats. When one of these rats dies, fleas feeding on its blood will move to the next best thing – in a shut up house, this could be the nearest human, who might otherwise have been spared.

V0020711 A black rat sitting upright on the ground. Etching by W. S.

A black rat, William Samuel Howitt, 1808.

As the number of infected people in the city grew, it became difficult to police the shutting up of houses. By the time New’s children became ill, anyone who did not comply with the plague orders could be legally restrained, though some ‘inmates’ still managed to escape. Meanwhile, many of those in the upper echelons of society simply ignored the rules. Neighbours of the Lord Mayor himself were arrested when they tried to shut up his house, which became infected in September.

Even citizens free to move around the deserted city changed their behaviour. People walked in the middle of the street to avoid contact with others, only entered shops if no other customers were present and preferred not to handle money. All this clearly affected trade. Those involved in cloth making were, perhaps, worst affected, since plague was thought to be transmitted by the movement of fabrics. In hindsight this has some truth; fleas can survive for several weeks outside a living host and might easily be moved around – and out of – the city within bundles of cloth.

“All Trades being stopt, Employment ceased; the Labour, and by that, the Bread of the Poor were cut off… These might be said to perish, not by the Infection it self, but by the Consequence of it; indeed, namely, by Hunger and Distress, and the Want of all Things; being without Lodging, without Money, without Friends, without Means to get their Bread.”
Daniel Defoe, A Journal of the Plague Year, 1722

In this climate, John New’s life would have been strained even without the pain of losing his children. As it was, he didn’t have to struggle through the worst days of the crisis, which arrived later in August and September. New died just three days after his children, one of an estimated 100,000 people who perished before the end of October, when cooler weather arrived and London’s most recent epidemic subsided.

The stranger who started an epidemic

New Orleans, 1853. James McGuigan arrives in the port city and succumbs to yellow fever.

L0074835 Development of yellow fever

A patient in the final stages of yellow fever, observed at Cadiz, Spain in 1819.

On 27 May 1853, 26-year-old James McGuigan admitted himself to the Charity Hospital in New Orleans saying he had felt sick for four days. Within hours he became delirious and, early the next morning, he threw up black vomit, a symptom familiar to anyone who had lived through New Orleans’s regular yellow fever outbreaks. By 6am McGuigan was dead, the first fatality in the worst epidemic any American city had ever experienced.

McGuigan, an Irishman, was a stranger to New Orleans. He had arrived in America’s fifth-largest city just 17 days before, aboard a ship carrying 314 Irish immigrants from the British port of Liverpool. Two days later a sailor from another newly arrived ship, the Augusta, transporting passengers from Britain, Germany and other European countries, died in the same hospital. At the autopsy, local physician Dr Erasmus Darwin Fenner observed the yellow colour of the sailor’s skin. Black vomit was found in his stomach.

These apparent cases of yellow fever, though occurring unusually early in the year, prompted Fenner to launch a “scrutinizing investigation”. The doctor’s inquiries revealed that the Augusta had travelled up the Mississippi river together with a ship from Kingston, Jamaica, where yellow fever was rife. What’s more, Fenner discovered that “free communication” had taken place between the two vessels and that the Augusta ended up mooring “not more than a hundred yards” from McGuigan’s ship.

Fenner’s focus on the ships that carried these two men to New Orleans was not unusual. At the time of their deaths, no one understood how yellow fever spread or knew, for sure, whether it was contagious. Among other scapegoats, the disease was blamed on ships, distant countries and poor sanitation. Immigrants like McGuigan were also blamed for bringing the disease on themselves – and contributing to its spread through cities – because of their supposedly immoderate lifestyles.

“Contagionists have attributed our yellow fever (the paternity of which no nation is willing to own) to Siam, where it was never known, and for no better reason than that the country itself is at the uttermost end of the earth…”
Bennet Dowler, Tableau of the Yellow Fever of 1853, 1854

In early July, as the annual influx of mosquitoes swamped the city, one local newspaper described New Orleans as looking and smelling “epidemical”. Some residents responded by fleeing, but many remained calm. Yellow fever, after all, was known as the strangers’ disease. Ever since the earliest city epidemics in the late 1700s, everyone had known foreigners and travellers from other parts of America were much more susceptible than those born in New Orleans.

TO-01-04-DataViz-updated

Yellow fever deaths in New Orleans by place of birth, 1 May to 31 October 1853.

 

TO-01-05-DataViz-updated

Proportion of yellow fever deaths in New Orleans by place of birth, 1 May to 31 October 1853.

Accepted medical wisdom ranked newcomers to the city as those most at risk because they had not yet become used to the subtropical climate that was so very different from their own. Locals, on the other hand, were thought to adjust, or acclimate, gradually over time.

“The Yellow Fever was denominated Strangers Fever, (it being the same disease,) because while it attacks strangers, or those who are not acclimated… the native adult, and those who are acclimated are exempt.”
Thomas Y Simons, A Report on the History and Causes of the Strangers or Yellow Fever of Charleston, 1839

The idea of ‘acclimating’ (or not) fitted with both official and unofficial records of death rates. A businessman writing about an earlier epidemic that occurred in 1847 attributed nine-tenths of all funerals within a fortnight to Irish victims; six years later, immigrants accounted for 90 per cent of recorded yellow fever deaths, despite making up less than half the city’s population.

In general, Irish and German immigrants appeared most likely to die from the disease. The 1853 death rates for both groups were reported to be 20 times higher than those for native New Orleanians. The ‘stranger’ factor was even recognised by the Mutual Benefit Life and Fire Insurance Company of Louisiana, which charged high premiums to Americans travelling to New Orleans during the summer months on the basis that these visitors were not acclimated to the area.

However, acclimation wasn’t the only factor thought to lead to high death rates in some marginalised groups. Most newspapers, physicians and the public associated the origin and spread of yellow fever with the lowest levels of society and the “miserable, filthy, loathsome manner” in which these communities were thought to live. More judgmentally, the local shipping clerk who penned that phrase also positioned those who appeared most susceptible to the disease as “a set of rumdrinking, fighting people”, clearly something ‘other’ than his own kind.

Medical opinion followed a similar line, with the editor of the New Orleans Medical and Surgical Journal suggesting that the prevalence of yellow fever could be reduced by raising both the social and moral conditions of the labouring classes. Local physician Dr J S McFarlane felt visitors to the city could exempt themselves from infection by remaining sober and living an “orderly” life, a habit that might be difficult for those immigrants forced to sleep a dozen to a small single room.

TO-01-06-Yellow-Fever

The “fatal epidemic” of yellow fever in New Orleans from The History of Yellow Fever in New Orleans, 1854.

Some people viewed yellow fever epidemics as a blessing that kept the number of these undesirable immigrants in the population down. Even setting aside this extreme view, the categorisation of yellow fever as a disease of the lower classes, one that left respectable residents either uninfected or only mildly ill, meant tackling it was hardly a priority for health officials, who themselves blamed poor sanitary conditions.

In the absence of scientific explanations about how diseases spread, medical theories often reflect the social attitudes and prejudices of the time. At the time of the 1853 epidemic, social attitudes in New Orleans were inevitably influenced by the daily arrival of hundreds of immigrants, who ended up living in cramped conditions in the poorest parts of town.

“Every evil with which we have to contend is introduced by strangers.”
J S McFarlane, The Epidemic Summer, 1853

Only 20 years before, the US Census recorded just 46,000 residents in New Orleans. During the intervening period, as the Mississippi river trade boomed, the population grew rapidly. Americans from the northern states moved south to take jobs as clerks, doctors and lawyers. Irish immigrants like McGuigan, fleeing the potato famine of the late 1840s, and Europeans from countries undergoing revolutions took jobs as labourers. By 1852 the city census put the population at 145,000.

V0010539 André Mazet tending people suffering from yellow fever in th

The devastating impact of a yellow fever epidemic in Barcelona, 1821.

Two months after newcomer McGuigan had walked into the Charity Hospital, yellow fever was decimating New Orleans. In late July patients were forced to lie on the hospital’s floors as it took in 100 new cases a day. The mortuary “teemed with frightful corpses” and local graveyards struggled to cope with demand. Since residents who had stayed in the city were spending their time tending to the sick, trade came to a standstill.

All this chaos was, we now know, ultimately down to the Aedes aegypti mosquito. If the insect draws blood from a yellow fever victim, it can transmit the viral disease, 10 or 12 days later, to anyone else it bites. These mosquitoes breed in urban areas and live within a relatively constrained area. Cities, and more specifically closely packed housing and cramped rooms – like those where new immigrants to New Orleans lodged – therefore become breeding grounds for both these mosquitoes and the disease they transmit.

TO-01-12-W0002050

A mosquito, Aedes aegypti, painted by Sir Philip Manson-Bahr.

Once bitten by a yellow-fever-carrying mosquito, newcomers were less likely than locals to be able to fight off the infection, since they lacked any immunity from previous exposure to the disease. Rather than gradually acclimating, locals and strangers alike developed immunity as they lived through yellow fever outbreaks. One reason the 1853 epidemic’s death toll was so high was that New Orleans had not experienced a major outbreak for several years, so few of its 100,000 recent arrivals had built up immunity, unlike earlier ‘strangers’ who had lived in the city long enough to survive previous outbreaks.

By the time the epidemic subsided in October 1853, a tenth of the city’s population had been buried and a further 40 per cent had suffered from the disease. Just over a century later, writer Susan Sontag described illness as “the night-side of life”, a place we all, at some point in our lives, are forced to “emigrate” to. In New Orleans half the population were forced to emigrate there, many of them immigrants already viewed as something ‘other’ than the local norm.

Of course, yellow fever isn’t the only disease to be linked with a marginalised group. Many infectious diseases have been described as conditions from other places or have been blamed on specific demographics. Groups of people classified by race, class or sexuality have all carried – at some time or other – the stigma of diseases, such as leprosy, cholera, syphilis and AIDS.

We don’t know if young James McGuigan was a “rumdrinking, fighting” man or not, but some people felt he and his fellow countrymen were more susceptible to infection because of such behaviour, and acted accordingly. In fact, McGuigan’s final jaundiced hours, spent haemorrhaging uncontrollably and regurgitating blood, were simply down to a bite from a mosquito carrying a disease he had not previously encountered.

Gallery: Diseases linked to marginalised groups

For Henry Adams, the gigantic electrical machinery was seductive in its grandeur but also spiritually alarming.

Titans in the landscape

From hydroelectric dams to pylons, the 20th-century architecture of electricity inspired a new kind of awe.

The architecture of electricity had as awe-inspiring an appearance as the machinery it housed. It was often described in quasi-religious language such as ‘temples’ or ‘cathedrals’.

“Soon, like Orion’s belt of fire,
Its broad electric arm shall hold,
With all a monarch’s strong desire,
The world and all its varied fold!
And from its tongue through every sphere
Till Time and Earth together cease,
Mankind the glorious tale shall hear
Of commerce, brotherhood and peace!”
E J O’Reilly, The Atlantic Cable, 1858

Printed in a Canadian newspaper to celebrate the laying of the transatlantic cable in 1858, E J O’Reilly’s poem prophesied the conquest of the world by electrical technology. The Atlantic Cable eulogised the unstoppable march of progress promised by electricity, and the international prosperity and friendship it would bring in its wake. Human mastery of nature was integral to this vision of modernity.

The possibility of wireless communication over distance, another testimony of electricity’s invisibility and impalpability, only increased the public’s sense of wonder.

Away from the unseen cable running across the bed of the Atlantic ocean, visible manifestations of the electricity network began to spread through the urban and rural landscape. These colossal markers of the steady spread of electrical power – power stations, transmission towers, dams, electric rails – were formidable signifiers of human control over the natural environment.

For many, electrical plants and powerhouses were seen not as carbuncles on the face of nature but, like the telegraphic cable, as monuments to modernity. Some became must-see sites on American tourist trails. Henry Ford’s powerhouse at his Highland Park factory had large windows designed to provide magnificent views into the machine rooms. The English writer Arnold Bennett was awestruck after a visit to a New York power station:

“Immaculately clean… shimmering with brilliant light under its lofty and beautiful ceiling, shaking and roaring with the terrific thunder of its own vitality, this hall in which no common voice could make itself heard produced nevertheless an effect of magical stillness, silence, and solitude… It was a hall enchanted and inexplicable.”

Hydro-electric power stations such as the one that opened at Niagara Falls in the 1890s also became popular tourist destinations. H G Wells was one of many visitors who were more interested in the electrical machinery than in the falls themselves. He wrote that the Niagara Falls dynamos represented the “human will made visible, thought translated into easy and commanding things”. They were “clean, noiseless, and starkly powerful… noble masses of machinery, huge black slumbering monsters, great sleeping tops that engender irresistible forces in their sleep”.

Electricity’s potential significance for transportation, leisure, technology and entertainment prompted as keen a public interest as light shows.

A similarly sublime expression of electricity’s power was offered by the American writer Frank Waters, who in 1946 declared the Hoover Dam to be “the Ninth Symphony of our day” and “The Great Pyramid of the American Desert”. It was enormously popular with tourists, and was visited by 750,000 people in 1934–35: as many as visited the Grand Canyon in the same year.

When long-line transmission systems began to criss-cross the American landscape, reaction was often enthusiastic. The Los Angeles Times wrote in 1913 about how “electric energy from the far-off Sierras stretched a hand robed with lightning across the gulf of valleys and mountains to the doors of the city.” In the 1920s the Chicago architect E H Bennett acknowledged that “to the mind of any imagination there is at times something irresistibly fine in the aspect of great airy structures stalking the hills”. Transmission towers also became, like the transatlantic cable, the unlikely objects of poetic veneration. In 1933 Stephen Spender wrote his paean to pylons, emblems of progress and modernity:

“But far above and far as sight endures
Like whips of anger
With lightning’s danger
There runs the quick perspective of the future.”

But as the century wore on, attitudes to the ‘iron forests’ shifted and opposition became more vocal. The environmental movement gathered momentum in the 1960s, and the ‘march of the towers’ was increasingly decried as a desecration of the landscape, much as wind turbines are today.

It was not only the aesthetics of the creeping electrical network that caused disquiet. The bewildering technological potential embodied in the machinery of electricity also prompted unease and anxiety, perhaps best expressed by the historian Henry Adams in his 1907 autobiography. He wrote of the dynamo’s “huge wheel, revolving within arm’s-length at some vertiginous speed”, whose silent power seemed to possess an occult and incomprehensible mystery. Adams feared that worshipping this symbol of the modern machine age had the potential to irrevocably displace other, more “spiritual” values such as art, or faith. Hinting at his ambivalence he composed his Prayer to the Dynamo: “Mysterious Power! Gentle Friend! Despotic Master! Tireless Force!”

This illustration of a speeding electrically powered train engine perfectly captures the association of electricity with speed, progress, and modernity.

The fear that Adams expressed about the mysterious industrial apparatus of electricity was not unlike the superstitious anxieties that Thomas Edison had observed among his workforce some 30 years earlier. According to Edison, when his electrified cables were to be buried underground, “the Irish laborers of the day were afraid of the devils in the wires”. This perception of electricity’s supernatural power is probably what lay behind Edison’s epithet, the ‘Wizard of Menlo Park’.

It may seem ironic that the mysterious qualities of electricity – its invisibility and immateriality, its silent power and intangible force – prompted such fears and also inspired enchantment and wonder. Yet electricity is full of ambiguities: its death-dealing and life-giving force, its capacity to illuminate and to extinguish, to heal and to inflict pain, to reawaken and to annihilate. Thunderous lightning, electric eels, galvanised corpses, floodlit facades and monumental machinery have stirred the most profound emotions in human beings. Through the words and images of those who have encountered its enigmatic and inscrutable power – whether poets, historians, engineers or scientists – we can begin to understand how electricity has shaped our deepest feelings.