The colonist who faced the blue terror

India, 1857. In a British enclave, Katherine Bartrum watches her friend, and then her family, succumb to the deadly cholera.

A girl suffering from cholera.

At 3pm on 29 June 1857, 23-year-old Katherine Bartrum, an Englishwoman living within the fortified walls of a British complex in the north Indian city of Lucknow, watched as her friend was taken ill with cholera. It was the disease most feared by British residents in India, but also by their compatriots back home, both for its rapid and horrific onset and for what it came to symbolise.

“There are few diseases which have excited more interest among medical men, or more terror in the mind of the Indian community at large, than the epidemic cholera.”
James Annesley (of the Madras Medical Establishment), Sketches of the Most Prevalent Diseases of India, 1825

During Bartrum’s bedside vigil her friend would have experienced severe diarrhoea, losing litres of fluid. Vomiting and writhing in pain, she would have suffered an unquenchable thirst, and her eyes and cheeks may have sunk into her face. Most startlingly, her lips, fingernails and skin would probably have turned an eerie shade of blue.

Within three hours the cold and clammy “dews of death” gathered on the patient’s brow and she lost consciousness. By 8pm, as her now motherless child slept unawares, Bartrum’s friend was already in her coffin. Surprisingly for the wife of a medical officer, this swift demise was Bartrum’s first experience of “death in any shape”. Yet, just a month before, cholera had claimed the life of the Commander-in-Chief of British India. Bartrum was also destined to encounter the disease again in the coming weeks.

A 23-year-old woman before and after contracting cholera.

Cholera had been known in India for hundreds if not thousands of years, but for centuries it was limited to the Bengal region in the east. The “blue terror” travelled across India – and beyond – as the British expanded their grip on a country that had been under the control of the British East India Company for a century.

As the leading cause of death among British troops in India, cholera earned itself a reputation as an insidious, violent enemy always ready to attack. The British viewed Indians – and their “very loose habits” – as the natural cause of the disease, but the British themselves acted as carriers. Their large-scale troop movements aided cholera’s emergence from Bengal, British soldiers fighting on India’s northern borders introduced the disease to their Afghan and Nepalese opponents, and British troops carried it to the Persian Gulf when they were deployed to Oman.

Nineteenth-century map showing routes of cholera from India to Europe and North America.

Even civil interventions by the colonial power contributed to cholera’s spread. By the time Bartrum arrived in Lucknow, the country’s first railway and the world’s largest canal – a network of routes spanning over 1,000 km – had opened. Both aided cholera’s expansion across the country.

Many Indians blamed the British for cholera’s spread, albeit for different reasons. Some believed cholera was meted out as divine retribution when the British defiled holy places or slaughtered cows, which are considered sacred in the Hindu religion. Others felt the disease was caused by deities who resented British rule. Since Indians were just as likely to catch cholera as the colonists, this meant the wrath of these gods was also targeted at Indians, who had failed to stand up to the British.

Cholera reached the heart of the British Empire too. When the first of four major cholera epidemics hit Britain in 1831, killing around 30,000 people, this ‘new’ disease sparked increased debate and a frenzy of analysis. For many years opinion was divided between those who believed cholera was spread through contact and those who blamed bad air and/or the effects of soil temperature. In the ten years before Bartrum arrived in Lucknow, efforts to understand the disease led to the publication of over 700 cholera-related books in London alone.

Gallery: studies of cholera outbreaks and causal factors

These studies served a purpose as epidemiological tools, but they also gave credence to politicised social policies. As the science of epidemiology developed, medicine shifted away from analysing the behaviour of individuals to investigating issues related to entire populations. These ranged from the nature of the water supply in specific parts of a city to the characteristics thought to be shared by a particular race. In the eyes of 19th-century Brits, the people of India – who were once viewed as fastidiously clean – were thought to be disorderly and dirty.

“The habits of the natives are such that, unless they are closely watched, they cover the entire neighbouring surface with filth.”
Royal Commission on the Health of the Anglo-Indian Army, 1863

Bartrum’s uninviting house in Lucknow was certainly dirty, but filth was also a fact of life in England. However, the 1848 Public Health Act – prompted by Edwin Chadwick’s report on the sanitary conditions of the labouring classes – now set England apart from India. While the home country was taking steps to bring filth and disease under control, India was viewed as stagnant and lacking self-discipline, like the immature child of the great British parent. In this climate, cholera came to symbolise the aspects of Indian society most feared by Europeans.

“One is no less saddened to see the populace as cruelly decimated by this horrible scourge in Berlin, London, and Paris, which stand at the head of modern civilization, as in the backward nations of the Orient and Northern Europe.”
Gazette médicale de Paris, 1832

Fear was the reason Bartrum herself had come to Lucknow. She had arrived seven weeks before, leaving her husband at another military station after Indian soldiers mutinied and killed civilian Europeans living in the city of Delhi. This event marked the start of India’s First War of Independence, and soon led to a six-month siege of the city where Bartrum had taken refuge. From this point on, to British eyes, India was increasingly a place of barbarism.

John Bull defending Britain against the invasion of cholera, 1832.

Poster warning of the “alarming approach” of what was described as “Indian” cholera, produced in London in 1831.

In London, Dr John Snow had argued in 1849 that cholera was caused by swallowing poisonous matter that was transmitted through faeces and contaminated water. However, his views did not gain acceptance until at least a decade later. In the meantime, the British medical establishment maintained the stance that Indians were somehow fundamentally different to Europeans. Though scientific investigations found little evidence that race played any role, Indians were inextricably linked with the cholera they were thought to produce.

“Their ways of living are not ours, and for hygienic reasons… close proximity is not desirable.”
Kate Platt, The Home and Health in India and the Tropical Colonies, 1923

Nineteenth-century caricature revealing the microscopic impurities found in London’s drinking water.

Of course, if India and Indians were viewed as irredeemably unsanitary, the British administration could excuse itself from spending time and money trying to improve conditions. Susceptible areas in England were seen as unhealthy and vulnerable until improved; India, on the other hand, was beyond hope. Medical theories – despite the evidence – supported the differing political moods at home and abroad.

In England cholera was an alien invader, a colonist in its own right, occupying both the body and the land. As epidemic followed epidemic, people feared the disease might eventually ‘settle’, taking over the country. At the same time, the British administration in India prioritised the health and comfort of its own troops above all else. The Indians now fighting to eject the British from their homeland had to live in far worse conditions.

Just as these Indian rebels laid siege to Lucknow, Katherine Bartrum’s 17-month-old son Bobbie contracted cholera. Though the doctor told Bartrum her son was dying, she administered “the strongest remedies that could be given to a child” and knelt by his bed all night. By morning the outlook was better: Bobbie “began to revive, sat up, and looked so bright”.

Despite being struck down herself the following day, and discovering two months later that her husband had been killed in action, Bartrum and her son managed to survive until the British withdrew from Lucknow four months later. The pair then travelled to Calcutta and boarded a ship bound for England. The night before it set sail, Bobbie, who had been growing weaker by the day, died.

Poster advising residents of east London not to drink unboiled water during the 1866 cholera epidemic.

When Bartrum arrived back in England, London was in the midst of the ‘Great Stink’, a summer in which the stench of excrement from the Thames became so intolerable that politicians launched a project to develop a citywide sewer system. England experienced its last cholera outbreak eight years later. In London it was localised to an area not yet connected to the new sewage network. But in India millions of people died in later outbreaks. Today cholera remains, as it was before the 1800s, endemic in some areas of the country.

Sharing nature: making connections

As part of our Sharing Nature project, over the past fortnight we asked you to share your photos on the theme EMOTION, and respond to other people’s submissions. You decided Magda Harmon’s contribution was most meaningful.

“Connectedness. We are nothing without nature. We are nature. Nature is us. Our veins, branches of an ancient oak, the Amazon river – All is One.” The photograph and words Magda Harmon submitted on the theme EMOTION.

For Magda, nature is all about feeling connected. Her photograph looking up into a tree canopy that’s just coming back into leaf illustrates that emotion. Rather than viewing vegetable life as something separate from and other than herself, Magda sees interconnections, believing “all is one”.

Magda also notes the patterns the branches create against the creamy grey sky, and draws out the similarities the shapes have with a river, and with veins. She has a point. Looking through Wellcome Collection’s image library, there are parts of human and animal bodies that are rather root-, branch- and river-like, including these:

In an article for The Conversation, Richard Taylor, Director of the Materials Science Institute and Professor of Physics at the University of Oregon, asks us to consider the tree as an example of something that’s ‘fractal’: “First you see the big branches growing out of the trunk. Then you see smaller versions growing out of each big branch. As you keep zooming in, finer and finer branches appear, all the way down to the smallest twigs.” Taylor says these fractals, or repetitive patterns, are one of the key things that make a work of art or a natural scene visually appealing and stress relieving. So, just looking at Magda’s beautiful, fractal tree photograph, and perhaps feeling some kind of emotional connection with it, could be doing you the world of good.
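
Taylor’s zoom-in description is easy to make concrete. Here is a minimal sketch – my own illustration, not anything from Taylor’s article – that draws a fractal tree in Python using the standard turtle module, applying the same branching rule at ever-smaller scales; the angle and shrink factor are arbitrary choices:

    import turtle

    def branch(t, length, depth):
        # Draw one branch, then two smaller copies of the same rule.
        if depth == 0:
            return
        t.forward(length)
        t.left(25)
        branch(t, length * 0.7, depth - 1)   # smaller left sub-branch
        t.right(50)
        branch(t, length * 0.7, depth - 1)   # smaller right sub-branch
        t.left(25)                           # restore original heading
        t.backward(length)                   # walk back down this branch

    pen = turtle.Turtle()
    pen.speed(0)
    pen.left(90)         # point upwards, like a trunk
    branch(pen, 80, 7)   # seven levels of ever-finer branches
    turtle.done()

Zoom into any part of the resulting drawing and you see the same forked shape repeating, which is exactly the property Taylor describes.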

Sharing Nature continues until 1 October 2017, and upcoming themes include relationships, dead, green, alone, plastic, health, and consume. A museum of modern nature is at Wellcome Collection until 8 October 2017.

The cook who became a pariah

New York, 1907. Mary Mallon spreads infection, unaware that her name will one day become synonymous with typhoid.

A typhoid patient, 1882.

In March 1907 a cook named Mary Mallon was visited by a sanitary engineer at her place of work, a swanky townhouse on New York’s Park Avenue. The unexpected caller told Mallon he suspected her of making people sick and requested samples of her urine, faeces and blood. This encounter marked the first time any healthy person in America had been accused of transmitting typhoid fever, a disease responsible for the deaths of 13,000 people in the USA the year before.

Mallon almost certainly thought the engineer’s claim preposterous. It was true that two members of the Park Avenue household in whose kitchen she worked had recently contracted typhoid: a chambermaid, followed, fatally, by the daughter of the homeowner. But Department of Health officials had already blamed the outbreak on the public water supply, and Mallon, who took pride in her work, was surely too clean to be a threat?

Besides, typhoid was everywhere, and Mallon had never had the disease herself, so how could she possibly spread it? Angry at the intrusion into her workplace and the smear against her character, Mallon seized a carving fork and chased her accuser out onto the street.

Mallon’s reaction was the complete opposite of what the sanitary engineer, George Soper, had expected. He later made sense of the cook’s “indignant” and “stubborn” behaviour by classifying her as peculiar and “perverse”, adding descriptions of her walking and thinking “more like a man than a woman” to his otherwise scientific reports. This idea of Mallon as unfeminine, deviant and wayward perhaps made it easier to justify her later treatment, or perhaps even influenced how Soper and others approached her in the first place.

Though the two had not met prior to their Park Avenue showdown, Soper had identified Mallon as the guilty party in a mystery he had been unravelling for months. Asked to investigate an unexplained typhoid epidemic at the summer house of a New York banker the year before, he had found his killer clue in the discovery that a cook had started work in the house three weeks before the outbreak and had moved on three weeks afterwards. Soper tracked down this “Irish woman about 40 years of age, tall, heavy, single” and in “perfect health”, through the employment bureau that had placed her in that role.

The next time the pair saw one another, Mallon came home to find Soper waiting for her on the stairs outside her lodgings. She again refused to provide the samples Soper demanded. He responded by recommending that she be taken into custody by the New York City Department of Health. To support his case, Soper shared information about other typhoid outbreaks, dating back seven years, which had all occurred in homes where Mallon had worked. He added that her excrement should be “made the subject of careful bacteriological examination”.

“I called Mary a living human culture tube and chronic typhoid germ producer. I said she was a proved menace to society.”
George Soper, The Curious Career of Typhoid Mary, 1939

Typhoid bacteria from one of the earliest surviving films of bacteriological research, 1910s

Less than a week later, Mallon turned away another visitor requesting samples of her bodily fluids. The following day this new pursuer, Dr Josephine Baker, returned with reinforcements. Three policemen surrounded the Park Avenue house. Another joined Baker on the doorstep and a horse-drawn ambulance parked nearby, ready to take Mallon away. The as-yet-unproven “germ producer” promptly vanished, taking refuge in the outside toilet of a neighbouring house, while someone else piled a dozen ashcans outside its door.

Several hours later the toilet door was finally pried open and Mallon, fighting and cursing, was forced into the ambulance. On the way to hospital Mallon was said to be so “maniacal” that Baker had to sit on her. When they arrived, Mallon’s faeces were collected and analysed. The results confirmed what Soper’s epidemiological investigations had predicted: Mallon was carrying “a pure culture of typhoid”. Though no one quite understood how this could be the case, given Mallon’s good health, it’s likely that she had experienced a mild bout of the disease some years earlier, without even noticing.

In 1907 the concept of healthy typhoid carriers was just beginning to generate scientific interest. A year before, a woman who had recovered from typhoid ten years previously had been identified as a carrier in Strasbourg, Germany. Scientists soon realised that the bacteria responsible for typhoid survived in a small percentage of people’s bodies long after any ill effects had passed. Why this happened, no one really knew.

White blood cells attacking typhoid in the bloodstream, 1912.

Within a fortnight of Mallon’s capture, the New York press published her story, without revealing her name. Mallon was labelled a “human typhoid germ”, a “danger to the community” and a “walking typhoid fever factory”, further dehumanising a woman who had been arrested, incarcerated and medically examined without charge or trial. Mallon was soon confined on an island less than a kilometre square, where she was examined several times a week. She spent most of the rest of her life there.

In 1909 Mallon hoped to gain her freedom from North Brother Island at a hearing in the Supreme Court, an event that also allowed the press to reveal her name. They responded by inventing a new title, still used to describe a pariah or, according to the Oxford English Dictionary, “a transmitter of undesirable opinions or attitudes”: Typhoid Mary. The newspaper The American, while sympathetic to Mallon over her imprisonment, ran a feature under this title, illustrating it with a drawing of a cook tossing human skulls into a frying pan.

Mary Mallon is publicly named – and renamed – in The American, June 1909.

“My name is Mary Mallon. I was christened and baptised Mary Mallon. I lived a decent, upright life under the name of Mary Mallon until I was seized. (Then I was) locked up in a pest-house and rechristened ‘Typhoid Mary’, the name by which the world has ever since known me.”
Mary Mallon

Mallon’s case was unsuccessful in the face of laboratory cultures from her own stools, which repeatedly showed the bacteria that caused typhoid. The health department presented this as incontrovertible evidence that Mallon was a danger to society. Her own lawyer argued that Mallon’s constitutional right to due process had been violated. In essence, the two sides debated a thorny issue that remains relevant today: how do we protect the wider public’s health without infringing on individuals’ civil liberties?

After the appointment of a new health commissioner in 1910, Mallon was eventually freed on the condition that she no longer work as a cook. The new commissioner even helped her secure a job in a laundry. For the next two years Mallon seems to have avoided working with food. However, she may have found it difficult to earn enough income to support herself, or perhaps she simply wanted to return to her profession of choice.

In 1914, after having fallen off the Department of Health’s radar, Mallon used an assumed name to take up a job as a cook at a New York maternity hospital. It’s unlikely she deliberately set out to cause harm to those who ate her food. Instead, Mallon almost certainly didn’t believe the science she had been presented with and knew she would be prevented from working as a cook if she used her own name.

When 25 hospital staff contracted typhoid in 1915, and two died, the outbreak was again traced back to Mallon, who was returned to North Brother Island. There she remained, quarantined, until her death. In the intervening period, other healthy carriers of typhoid had been identified in the city, including two men who worked in the food business. These carriers were treated in a manner starkly different to that of Mallon.

A representation of typhoid infection from a French advertisement for disinfectant paper, 1890.

One of them, Belgian-born Alphonse Cotils, owned a New York bakery, in which he continued to work despite officially being forbidden to do so. When taken to court in 1924, Cotils received a suspended sentence. The judge acknowledged the extreme danger Cotils posed, but stated that he could not legally jail him “on account of his health”.

Did Cotils escape confinement due to his gender, his nationality or his successful business? Perhaps he wasn’t considered as much of a danger as Mallon, who had a reputation for being pathologically angry and acting irrationally. Official documents of the time show that Mallon was often described as an “Irish woman”, while other carriers were not usually identified by their race or gender. Comments about her personal appearance or demeanour were also common, while absent in the descriptions of others.

“Why should I be banished like a leper and compelled to live in solitary confinement with only a dog for a companion?”
Mary Mallon

Mallon died on North Brother Island in 1938, 31 years after she was first taken there. Her death was reported by the Lancet medical journal, which described the woman who made her living as a cook as “a chronic typhoid carrier”. The man responsible for reporting her to the Department of Health concluded after her death that Mallon had a “curious career” as “the most famous typhoid carrier who ever lived”. It was, of course, not a career she ever chose to pursue.

Tamagotchi and me: a personal and potted history of digital pets

A Japanese import began my lifelong relationship with virtual animals. But what happens when they die?

Adam Levene.

That’s who started it.

I’d like to say it began behind the bike shed, but Adam wasn’t that sort of kid, and if Adam wasn’t that sort of kid, then I was about as far from that sort of kid as it was possible to get. Case in point:

Start ’em young: me with original Game Boy, circa 1990.

But Adam was the first, and I remember the day he walked into Year 8, hands cupped protectively around the forebear of digital pets. There, in his palm, in all its egg-shaped and brightly-coloured plastic glory, lay a Tamagotchi.

By the end of the week I had one too, and so it seemed did the rest of the world.

If you’ve never heard of a Tamagotchi, because you were not yet alive in the 1990s, avoided children and/or general humanity back then, or just couldn’t give a flying Furby, let me explain. A Tamagotchi was a handheld digital pet, the brainchild of Aki Maita and Akihiro Yokoi of Japanese toy company Bandai. They were unleashed in their native habitat in November 1996, and colonised the rest of the world in 1997. They proliferated quicker than anything conceived behind a bike shed because there had never been anything quite like them. They were every kid’s dream come true: a toy magically springing to life.

Commercial for the second-generation Tamagotchi, released just months after the first, with added ‘pause’ functionality to mitigate criticism that the toy was disruptive in schools.

These wee digital beasties existed in real time and needed constant beep-beep-beep-beep-beep-beep attention: feeding, cleaning, playing, medicating. If you didn’t pay them that attention, they evolved into even more demanding adults. And if you kept on neglecting them, they’d die. Just let that sit for a moment: straight up digital death. As per the law of games, you could always play again, but each individual Tamagotchi’s life was finite.
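
For the curious, the care loop described above can be sketched in a few lines of code. This is a simplified guess at the genre’s logic, not Bandai’s actual implementation; the stat names, rates and thresholds are invented for illustration:

    import time

    class Pet:
        def __init__(self, name):
            self.name = name
            self.hunger = 0    # rises in real time; feeding resets it
            self.neglect = 0   # accumulates whenever needs go unmet
            self.alive = True

        def tick(self):
            # Called once per unit of real time, whether you play or not.
            self.hunger += 1
            if self.hunger > 5:       # beep-beep-beep: it wants attention
                print(f"{self.name} beeps for food!")
                self.neglect += 1
            if self.neglect > 10:     # sustained neglect is fatal
                self.alive = False
                print(f"{self.name} has died.")

        def feed(self):
            self.hunger = 0

    pet = Pet("Tama")
    while pet.alive:          # ignore it, and death arrives on its own
        pet.tick()
        time.sleep(1)         # one real second per simulated 'minute' here

The crucial design choice is that tick() runs whether or not you are paying attention: unlike a conventional game, the simulation never waits for you.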

For the original Japanese Tamagotchi, death was represented as an ominous ghoul-like creature hovering next to a gravestone, but for the first wave of US and European models, death was sanitised – naturally – to a sparkly three-eyed alien angel. In fact, the representation of ‘Game Over’ in the generations of Tamagotchi released since 1996 has its own fascinating history, but the one constant has always been death.

‘Death’ animation in the first generation Japanese Tamagotchi.

Tamagotchi could also die from old age, but they only reached advanced years (real-time days) if you looked after them properly. No one in my school managed to do that, apart from, you guessed it: Adam Levene. Mine invariably evolved into neglected monstrosities with horribly shortened life spans. I got to experience both horrific grief and murderer’s guilt at the age of twelve: a deeply important life lesson for any pre-pubescent.

‘Death’ animation in the first generation US Tamagotchi.

In the interests of journalistic integrity, I don’t actually remember feeling much of a wrench when one of my aberrations expired, but I do remember when Adam’s first one copped it. He was inconsolable, weeping in the reading corner, shrugging us all away when we asked if everything was OK. Perhaps it was because Adam had put a little bit more time and effort into keeping his Tamagotchi alive, or maybe he was just a more sensitive and empathetic kid than the rest of us cold-hearted brutes. Adam wasn’t alone; there were tales of bereaved kids the world over.

This emotional attachment was given its own name, the ‘Tamagotchi effect’. The concept has since evolved to include any emotional entanglement we experience with ‘machines, robots or software agents’. While it ostensibly refers to digital beings, it’s worth remembering that we, as humans, can form emotional attachments to pretty much anything. I dropped a chocolate bar the other day and cried.

Cuddles and care

Perhaps the most powerful analogue object we may have experienced some sort of relationship drama with is the first ‘pet’ of our childhoods: the cuddly toy. As Alain de Botton (my favourite go-to person when something doesn’t quite compute emotionally) explains, soft toys are an important (and safe) projection of self for us when we’re growing up. Are digital pets just an extension of this? There’s one key difference: digital pets need us to look after them; they implore us to care.

The next digital pet after the Tamagotchi was another Bandai creation: Digimon (short for digital monster). In contrast to the unisex appeal of Tamagotchi, Digimon was explicitly marketed at boys. Why? Because you could connect with your friend’s Digimon and fight. And they were more… square. I had a Digimon, but I don’t remember playing with it much, probably because I was a girl and liked bevelled edges.

Digimon commercial. Warning: for boys only

It didn’t take long for rival toy company Tiger Electronics to bring out the cheaper Tamagotchi competitor Gigapet, which I didn’t want because I already had a Tamagotchi and a square Digimon. By late 1997, Bandai had already released a second-generation Tamagotchi, with more things you could do, more things it could do, and just more general do-ness. But my attention had already returned to an older digital pet, which I had been rearing on our family PC, long before my first Tamagotchi lived and died.

In 1995, San Francisco-based developer PF Magic released the PC game Petz (I questioned the ‘z’ back then, too). Petz – which came in the form of ‘Dogz’ and ‘Catz’ – were like Tamagotchi that lived on your home computer rather than in your pocket. You could adopt a puppy or kitten from a number of breeds, all with different personalities, which were in turn shaped by how you interacted with them. Simulating a degree of naturalism was key to the experience. Petz were designed ‘to be highly believable synthetic agents’, aided in large part by the blurring of reality and virtuality: Petz could strut across your desktop and leave their toys scattered across your Word document – even the mouse cursor was designed as a hand.

PF Magic’s demonstration video for Petz, a game well ahead of its time.

Whilst Petz did age, progressing from infants to adolescents to adults, they then stayed as adults indefinitely. In making a digital pet as ‘real’ as possible, the developers had abandoned the only given in life: death.

The Tamagotchi effect notwithstanding, I remember the two reasons I became devoted to my Petz. Firstly you could make two Petz breed by spraying your selected duo with frankly toxic levels of love potion. Their progeny, through some algorithmic magic, would emerge as endless iterations of the combined feature sets of their parents. Infinite fun to a kid who was still a long way from the bike sheds. Secondly, Dogz 3 came out at the same time as dial-up internet arrived, giving me my first contact with online communities. Hello fandom. A huge online sub-culture grew up around the game, and I soon learned it was possible to modify (or ‘hex’) my Petz to have anything from blue spots to bat wings to arthropod antennae.

Both these reasons gave me a real sense of ownership and connection to my Petz through customisation. Realism didn’t matter; this was about making something my own and bringing it to life; what kid is going to say no to playing God?

Imagez of hexed Dogz and Catz.

After the success of Petz, PF Magic went on to release Hamsterz, Horsez (even they regretted the ‘z’ at this point), and the super-creepy Babiez, which I never played because the disk was corrupted. To be honest, I’m glad, because babies and breeding is a virtual red line for me. But Petz was the progenitor of a strong line of PC-based digital pets and simulations. Other notables of this pedigree are the cult-classic Creatures, and the big Daddy of them all, The Sims, which also later spawned its own imaginatively titled pet-based expansion pack (brace yourself) The Sims: Pets.

It took Japanese console wizards Nintendo to push the evolution of screen-based digital pets forward via some crossbreeding with their hand-held littermates. The result was the ultra-cute Nintendogs, which launched with Nintendo’s handheld DS in 2005. Nintendogs was an instant hit with a new generation of kids to whom Tamagotchi were either a new Pokémon or a strand of avian flu. Advances in computer graphics pushed realism further than in any previous digital pet game. There were also new interactive experiences: these sweet little eternal-puppies could respond to and learn voice commands, while you scratched their bellies with your stylus – if you still had one.

The Nintendogs franchise is still going strong: Japanese commercial for Nintendogs & Cats, launched alongside the 3DS in 2011.

On booting up my copy of Nintendogs: Chihuahua & Friends (for research purposes of course), I discovered I’d called my pup Randolph. I think this may have been my Citizen Kane phase; I was 21 in 2005, and still buying (digital pet) games; the bike shed had become the student bar, and I wasn’t exactly a regular. Nintendogs didn’t grip me in the way earlier digital pet games had, possibly because I was not about to shout “RANDOLPH, SIT!” on the bus to a lecture. But at this point even I can admit to no longer being the target audience. Nintendogs resonated with a new generation by bringing the portability of Tamagotchi together with the processing power and complexity of earlier screen-based iterations. Digital pets were here to stay, “I SAID STAY, RANDOLPH!”

Like the Petz series, Nintendogs was more of a simulation than a game in the traditional sense. While it made use of the DS’s internal clock to track the passage of time, the realism stakes were set pretty low: if you neglected Randolph for a few days, you wouldn’t return to find he had committed suicide next to his snow globe, but merely that he had run away off-screen. A tap of the stylus would be enough to bring him gambolling back bearing a gift in the hope that you’d start treating him better. This sent out the message to kids everywhere that abuse is totally forgivable, and comes with bonus presents for the abuser.

The cybernetic tortoise

The original digital pets came from somewhere much more sentimental. The story goes that Aki Maita came up with the idea for Tamagotchi while watching a TV commercial in which a boy is forbidden from taking his pet turtle to school. She wondered whether she could create a digital pet for him to take along instead. With this in mind, when I asked Andrew Nahum, Keeper Emeritus at the Science Museum (who contributed to their latest book Robots, and is currently writing a chapter on cybernetics) his thoughts on the possible origins of digital pets, his answer made me smile: “Grey Walter’s Tortoises”.

Grey Walter was a British neurophysiologist who in 1948, as a pet (ba dum tss!) project, began building what are widely considered to be the first ‘scientifically significant robots’. His ‘tortoises’ (as he called them) were autonomous robots which used two inbuilt sensors to navigate their environment in an attempt to model the brain. As Andrew explains, “they pioneered lots of the features of later digital pets – sensitivity to touch and light, and quite crude circuitry and algorithms producing surprisingly lifelike behaviour. So I guess they can be thought of as the prototypes for later digital and robotic pets.”
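
To give a flavour of how such crude circuitry can look lifelike, here is a rough sketch of the sense-act loop Andrew describes: steer towards light, but back off when the shell bumps something. This is an illustrative program of my own, with simulated sensor readings standing in for the original’s analogue valve circuits:

    import random

    def light_difference():
        # Stand-in for the photocell: positive means more light to the left.
        return random.uniform(-1.0, 1.0)

    def shell_touched():
        # Stand-in for the shell's contact switch.
        return random.random() < 0.1

    for step in range(20):
        if shell_touched():
            # Touch overrides light-seeking: reverse and swerve past the obstacle.
            print(f"step {step}: bump – reversing and turning")
        else:
            # Phototaxis: turn towards whichever side sees more light.
            side = "left" if light_difference() > 0 else "right"
            print(f"step {step}: steering {side}, towards the light")

Two sensors and two rules are enough to produce the wandering, obstacle-avoiding, light-seeking behaviour that observers readily read as intentional.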

Contemporary news item about Grey’s Tortoises. Grey and his wife smile like proud parents.

The model of the tortoise may not be mere coincidence. Grey set out to make what he called a model animal, a ‘sim’ in all but name. He called (and therefore gendered) his two earliest models Elmer and Elsie, building kennels for them to charge up in, which only added to the sense of them being a kind of pet. Grey himself shamelessly anthropomorphised these devices. ‘She is taking her bottle,’ he told a visitor, when Elsie entered the kennel to re-charge. Andrew adds, “I don’t think Grey Walter would have minded them being called pets, but his intention was more serious.”

Grey Walter’s Tortoise. Photo by Rain Rabbit on Flickr

Staring at Elsie now in her glass box in the Science Museum, it seems surprising that people in 1949 who saw her deftly navigating obstacles thought that she really was sentient; even Grey believed his tortoises demonstrated self-awareness. “I guess our expectations go up all the time,” Andrew muses. But perhaps the emotional response Elsie elicited in her owner and the general public alike is the real link between her and her digital descendants: the ‘Elsie effect’. I’m left wondering whether a kid today playing with a Tamagotchi would find it less beguiling, knowing there are ‘better’ versions out there.

Modern-day kids react to the original Tamagotchi

In parallel with the explosion of hand-held and screen-based digital pets in the mid-90s came a new era of their robotic cousins (and Elsie’s direct descendants). Top of every kid’s Christmas list in 1998? The ubiquitous and rather creepy bug-eyed bird-monstrosity otherwise known as a Furby.

Get ready to hide behind a sofa: Furby commercial.

Furbies could move their eyes, mouths and ears, as well as jiggle around a bit, but the real reason you wanted one was because they could learn to talk. Initially they spoke ‘Furbish’, but the longer you spent with them, the more English they learned. Not only that, they could chat to other Furbies too. I remember mine spontaneously chattering away to itself in the middle of the night, then triggering my brother’s in the adjoining room, and it all ending up a bit Rosemary’s Baby.

Furby turned out to be a black box of over-hyped mystery, but that was part of the allure: I was convinced there was more going on than met the eye. Instead of disappointment filling the gap between expectation and reality, Furby managed to create even more expectation; their PR team must have been laughing at our collective gullibility, possibly via the secret camera mounted inside each Furby.

My most memorable experience of robotic pets of this era was Aibo, Sony’s iconic robot dog. Aibo was a serious robot pet, with an even more serious price tag, which meant that it wasn’t just for kids. My Dad made business trips to China, so we were lucky enough to find an Aibo under our Christmas tree in 1999. They say ‘a dog isn’t just for Christmas’ but our Aibo was just that: it came out of its box exactly three times before being deemed too much of a pain to get going. It has been sitting in my Mum’s basement ever since.

Pet cemetery

This brings us back to death. In 2006, after numerous generations of Aibo, Sony ended production. And in 2014, they stopped customer support altogether, including maintenance. Some devoted owners, experiencing the Tamagotchi effect turned up to 11, will now go to any lengths to keep their beloved pets alive. A lucrative second-hand parts market has emerged, and I am now seeing my Mum’s basement in a very different light.

A short documentary about Aibo owners affected by Sony’s decision to stop manufacturing replacement parts.

Digital pets themselves have undergone something of a virtual (no pun intended) sanitisation since the days of Tamagotchi. Death, at least as part of the simulated or game experience, has all but disappeared. Nintendogs never age, staying as puppies indefinitely, and in the 2010 Xbox 360 Kinect game Kinectimals the pets remain cubs forever. Overly cute – or ‘kawaii’ – character design seems to dominate every app-based digital pet game currently on the market. Even the pet in the mega-successful ‘retro’ virtual pet app Hatchi cannot die.

Gallery: contemporary pet games

Returning to a life-cycle with death as the end-game, I can now bring my personal history of digital pets up to date. I am developing a game with Ellie Silkstone of indie games studio Xylophone Games in response to the Wellcome Collection exhibition Making Nature. Our game, P.E.T., updates the digital pet genre, and was partly inspired by the curious story of the Facebook game Pet Society.

Launched in 2008, Pet Society was one of the most popular Facebook games of its time, with hundreds of thousands of users across the globe. Players could name, create, and look after pets, and share them with other users. But by 2013, the game’s owners EA decided that with profits falling, it was time to pull the plug.

While death for an Aibo was a physical thing (i.e. breaking), here was the definite and sudden end of something totally digital. Pet Society owners the world over were left grieving but also furious: they mobilised and lobbied EA to release the data and return their beloved pets. To this day, they still haven’t got them back, and as of writing, the (very polite) Facebook group ‘Please Save Pet Society’ still has nearly 11,000 owners pining for their lost pets. Digital death has come full circle: not as part of the game, but as an unforeseen consequence of a commercial imperative.

A ‘reason’ comment left on the Change.org ‘Save Pet Society From Closure’ petition page.

And to bring things even more up to date, the Ozimals brand of ‘breedable’ rabbits in the sim of all sims, Second Life, was recently hit with a cease-and-desist order. The databases which allowed the bunnies to eat – and Ozimals to keep on profiting – would cease to function as of 17 May 2017. The result? All bunnies in-game would starve to death, or as the owner of Ozimals so gently put it, ‘hibernate’.

Ellie and I found this tension between real and digital death fascinating. And so P.E.T. was born, a pet simulation exploring the digital life – and death – of a virtual pet. In steering that pet through its life-cycle, the game explores the ethical implications around the customisation and commodification of pets, alongside the player’s shifting relationship to something which makes no (digital) bones about the fact it isn’t real.

When I talked about P.E.T. to the philosopher and social epistemologist, Professor Steve Fuller of Warwick University, he was very interested in how the game could bridge the gap between our understanding of the rights of animals and the rights of robots. But the bigger question P.E.T. asks is not just about our responsibility to our digital companion animals, but about the very real ethical responsibility of the company that creates them to us.

In Japan, plummeting birth rates are being linked to the popularity of digital ‘pet’ girlfriends with young men.

As part of my research for P.E.T., I exhumed my old Tamagotchi. With a new battery installed, its digital resurrection was complete. Within minutes my egg hatched, and a little black blob of embryonic pixels was bouncing around the screen. I quickly fell under its spell, responding to its every beep-beep-beep-beep-beep-beep need. A day later, it evolved into a larger white blob of pixels, and two days after that, into a rather ugly head with a beak, and I still diligently attended to its every need, even as my patience wore thin, particularly with the ‘guess which way I’m going to turn’ game.

During a day of meetings and beep-beep-beep-be– I quickly realised I had to mute my pet. Without audible reminders, my Tama had a patchy day of care, and when its next evolution took place, it became one of the ‘disobedient’ characters, a sort of duck head with legs. Fair enough, I thought. The next day was even worse, and when I got home, after I had emailed, cooked, washed up, got angry at someone on Twitter and Question Time simultaneously, and was about to fall asleep, I suddenly remembered I’d not checked my Tamagotchi all day. All day. It was midnight and this is what I found:

I can barely look: my neglected Tama.

That is a sleeping Tamagotchi surrounded by no fewer than four of its own excretions; it may have done more off-screen for all I know. Once a Tamagotchi falls asleep, you have to wait for it to wake up before you can clean up after it. It wouldn’t wake up for another ten hours.

I wasn’t expecting it, but as I looked down at this pitiful creature in my hand, I suddenly felt a pang of guilt. I’d done that.

The next day was even busier, and I forgot about my Tamagotchi once again.

When I next remembered to check, I found the sparkly three-eyed alien angel had been.

My Tamagotchi was dead.

I felt destroyed.

Research Notes: a chance discovery

When library research gives Wendy Moore “sweaty palms, a thumping heartbeat and fluttering in the stomach”, it’s a surefire sign she’s found the subject of her next book.

We think of libraries as places of quiet solitude where information is reassuringly organised, ordered and catalogued. Yet for me one of the best things about libraries is serendipity. It’s that unexpected jewel of information which changes everything, that missing jigsaw piece that solves a puzzle or – in my case – the chance discovery which led to my next book.

Charles Dickens on a lecture tour. Illustrated London News, 1870. Image credit: Wellcome Library.

That serendipitous moment came on a cold January day in 2012 when I arrived at the Wellcome Library to do some research for a column I then wrote in the British Medical Journal. Since it was the bicentenary of Dickens’ birth, I had decided to write about the doctors in Dickens’ novels. I duly read about Sir Tumley Snuffim, Dr Kutankumagen, Mr Slasher and the various bumbling surgeons and grasping physicians with whom Dickens peppered his works. But it was a doctor who never appeared in a Dickens book, a doctor who became the author’s medical advisor and one of his closest friends, who transfixed me.

Title page of a pamphlet published by E. Hancock, 1842. Image credit: Wellcome Library.

I chanced upon John Elliotson in a short book, Dickens’s Doctors by David Waldron Smithers (1979), on the Library shelves. And as I read about this complex man, who was born into the Georgian era but became one of Victorian society’s most intriguing personalities, I recognised those unmistakeable symptoms – sweaty palms, thumping heartbeat and fluttering in the stomach – which are the surefire signs of a possible new book.

A quick check in the catalogue revealed there were numerous works on the topic of mesmerism but no biography of Elliotson. A search in the archives uncovered a treasure trove of curiously-titled publications by and about him. What on earth was The Zoist, which he edited for 13 years? What induced him to publish a pamphlet entitled False accusation in the Royal Medical and Chirurgical Society against a poor man because he suffered no pain while his leg was amputated in the mesmeric coma? And why had he inspired a pamphlet with the even stranger title A full discovery of the strange practices of Dr Elliotson on the bodies of his female patients? I was hooked.

Elliotson, I discovered, was the son of a prosperous chemist who determined in his early teens to become a physician. He gained not one but two medical degrees – from Edinburgh and Cambridge – trained at Guy’s and St Thomas’ Hospitals, and established himself as one of the capital’s most respected practitioners. By 1837 Elliotson was professor of medicine at University College London and physician at University College Hospital with a grand house and consulting rooms in the West End. And then he risked it all – by embracing a controversial new idea: mesmerism.

Portrait of John Elliotson. Lithograph by R Martin. Image credit: Wellcome Library.

Elliotson was no stranger to novelty or controversy. As an up-and-coming doctor he had discarded the traditional physician’s breeches for newly fashionable trousers and sported whiskers according to the latest trend. This 19th-century hipster was an early adopter of medical innovations too – especially when they emanated from the Continent. He championed the stethoscope, dabbled with acupuncture and became one of Britain’s foremost advocates of phrenology. So when a French disciple of mesmerism, Baron Jules Dupotet de Sennevoy, arrived in London in June 1837 to promote his doctrine, Elliotson was first in line.

A mesmerist using animal magnetism on a seated female patient. Wood engraving from a French newspaper, ca. 1845. Image credit: Wellcome Library.

Mesmerism – effectively hypnotism – had been named after German physician Franz Anton Mesmer who discovered in the late 18th century that he could induce a kind of sleep in his patients through repetitive hand motions and vocal suggestions. In this state, people slavishly followed his commands, lost their inhibitions and became insensitive to pain. Mesmerism garnered supporters throughout Europe but had failed to find a following in Britain – until Elliotson took an interest.

John Elliotson. Lithograph after J. Ramsay. Image credit: Wellcome Library.

Elliotson invited Dupotet to try his technique on some patients at UCH and, impressed by the results, he adopted the method himself. Over the next 18 months Elliotson mesmerised dozens of patients at UCH and he invited friends, colleagues and journalists to witness his experiments. One of his guests was the young Dickens who made the short journey to UCH from his house in Doughty Street with his artist friend George Cruikshank on 4 January 1838. Elliotson’s casebooks, in the UCL archives, record their visit. Dickens was so bewitched by what he saw that he learned the technique himself – later practising on his wife Catherine and others – and became a lifelong enthusiast for mesmerism. Through his friendship with Dickens, Elliotson became the medical darling of the Victorian literary world.

Elliotson went on to stage dramatic demonstrations of mesmerism in the lecture theatre of UCH which drew large crowds and sparked sensational headlines. In particular, spectators were entranced by the Okey sisters, 17-year-old Elizabeth and 15-year-old Jane, who had been admitted to UCH for epilepsy. Their names were routinely misspelled O’Key, and writers ever since have assumed the girls were Irish. In fact, genealogical records show, they came from a working-class English family in nearby Somers Town. Under mesmerism these two demure little misses were transformed into precocious little minxes who joked and flirted with their audiences as well as lifting heavy weights and withstanding electric shocks.

Ultimately, however, Elliotson’s love of novelty was his downfall. Not content with showing that mesmerism could improve certain conditions and banish pain, he was bent on witnessing bizarre phenomena reported from the Continent where doctors claimed patients could forecast the future, diagnose other patients’ ailments and distinguish ‘mesmerised’ water and metal. When The Lancet demolished the latter theory in exhaustive tests on the Okeys, Elliotson’s reputation was severely tarnished. But he sealed his own fate by taking Elizabeth Okey to a ward where she predicted two patients would shortly die. UCL promptly banned mesmerism and he had no option but to resign.

Image credit: Wendy Moore / Weidenfeld & Nicolson.

His reputation in tatters, Elliotson stayed faithful to mesmerism. He staged demonstrations in his own home, which were gleefully described in the anonymous pamphlet A Full Discovery. He founded the London Mesmeric Infirmary to offer mesmerism to poor patients; its sixth annual report survives in the Wellcome Library archives. And he launched his own journal, The Zoist, which ran to 13 volumes, to publicise mesmeric treatment and operations painlessly performed under mesmerism more than four years before chemical anaesthesia.

Chasing Elliotson took me on a three-year detective trail through the archives of UCL, the Royal Society of Medicine, Dr Williams’s Library and beyond. But it all sprang from a moment of serendipity in Euston Road.

Wendy is giving a talk about her new book, The Mesmerist, at Wellcome Collection on Thursday 29 June 2017.

The tradesman who confronted the pestilence

The City of London, 1665. As the Great Plague hits the capital, John New faces a deadly dilemma.

Victims of the 1665 plague are lifted on to a death cart, Samuel Wale, 1747.

On 6 August 1665, John New, a weaver in the bustling London parish of Cripplegate, watched his son and daughter die from what local broadsheets were already calling “The Great Plague”. The latest outbreak of this fearful disease, and the efforts taken to control its spread through one of Europe’s largest commercial centres, struck weavers like New, as well as other artisans and tradespeople, particularly hard.

Hours or days before, New would have spotted painful black swellings, known as buboes, around the groins, armpits or necks of his children. Such was the fear of plague spreading through the city that symptoms like this had to be reported to the authorities within just two hours. The children would have soon experienced headaches, vomiting and pain so intense that many victims were overcome with frenzy. Their swellings would have turned red, purple or black and may have ruptured to form open sores. The only mercy for New and his family was that death – a near certainty – came quickly.

Despite published precautions and remedies, the only sure way to avoid this fate – which appeared to befall rich and poor alike – was to leave the city. New would have had plenty of time to consider this course of action. It’s possible he would have heard about the spreading “contagion” two months before, when the first weaver in Cripplegate died from the disease and the city’s playhouses closed. A month later the Lord Mayor issued orders designed to control the infection and, by the middle of June, New couldn’t have failed to notice that Londoners were fleeing the city in droves.

“So homewards and to the Cross Keys at Cripplegate, where I find all the towne almost going out of towne, the coaches and waggons being all full of people going into the country.”
Samuel Pepys in his diary entry of 21 June 1665

People fleeing the plague, first printed 1630.

Even after a hundred local people involved in his trade had died, and the Bishop of London had announced that “many thousands of poore Artisans” were on the brink of starvation due to lack of business, New remained in the pestilential city. In fact, though diarist Samuel Pepys claimed “all the towne” was leaving, tradespeople like New, and the city’s poorest inhabitants, had little option but to stay.

Gallery: scenes from the Great Plague

The well-to-do, like the poet John Dryden – who retired to his father-in-law’s estate in the West Country – and many of London’s physicians, had the means to pay for transport out of the sick city. They left their servants to watch over their expensive properties and were easily able to support themselves for an extended stay away from London.

New would have struggled to find the five-shilling fare for the cheapest waggon out of town. He also would have risked losing his livelihood if he left, since his home (which may have doubled as his place of work) might be ransacked in his absence, resulting in the loss of both goods and tools. Leaving the city would have removed any chance of keeping his business alive. However, staying wouldn’t do much for trade, due to controls placed on the movement of goods and people both before and during the outbreak.

Two years earlier the Privy Council – a small group of men who governed England under King Charles II – had taken action when the plague arrived in Amsterdam. The Royal Navy had intercepted ships in the Thames estuary and asked any vessels from Amsterdam to either turn back or anchor at the remote Canvey Island. There, cargoes were unloaded and aired for 30 days. Crew and passengers had to stay aboard for the same amount of time.

In the early summer of 1664, as deaths in Amsterdam rose, ships from all Dutch ports were held outside London for 40 days, a period known as ‘quarantine’, from the Italian word for 40. Though measures to separate the sick from the healthy have been used to limit – with varying levels of impact – the spread of infectious disease in many countries and periods, London’s approach to plague control was based on a concept first developed in Italy.

Gallery: quarantine procedures and structures

Growing out of the thinking of Florentine scholars and lawyers in the 1400s, the ‘ideology of order’ viewed society as a living organism that needs hierarchy and stability to survive. City authorities could therefore justify imposing draconian measures, which disrupted the lives of ordinary people, during periods of crisis such as epidemics.

The Thames shipping controls appeared to work – at first. During 1664 over 24,000 people died from plague in Amsterdam, while only five cases were reported in London. When the disease finally hit the English city, the Lord Mayor’s plague orders meant anyone wanting to leave required a health certificate. Those with the money to bribe officials would find it easier to obtain one than tradespeople like New.

Plague orders published by the Lord Mayor and Aldermen of the City of London, 1665.

The orders required householders to clean the street outside their home every day and insisted that burials only take place at night – with neither family nor friends present. No dogs, cats, rabbits or pigeons were allowed in the city, and businesses that involved travel or gatherings of people had to stop operating. Ale- and coffee-houses closed down, as did markets, theatres and other social spaces. Street sellers and hawkers were prohibited from selling any goods and all trade in second-hand clothes was banned.

Across the city, official examiners and searchers were appointed “to inquire and learn” which homes had been “visited” by illness and to identify those affected by plague. These houses were then shut up for 28 days, complete with all their inhabitants – whether infected or not. Plague homes were marked by a foot-high red cross on their doors and the words “Lord have mercy on us”. Watchmen supplied the inhabitants with food, while ensuring none of them left the property. Nurses could be provided at the cost of the parish.

“A band of Halberts mustred were, to guard –
The People from the Plague in evry Ward
And if they found by making inquisition
(Or had but any probable suspition)
Where lodgd it had (although but for a night)
That man was banisht from the publick sight,
Imprisned in his house both night and day,
As one that meant the Citie to betray.”
George Wither in his epic poem about an earlier plague outbreak, The History of the Pestilence, 1625

The practice of shutting up the healthy alongside the sick was unique to England. Local writer Daniel Defoe, aged just six at the time, later described “the confining the Sound in the same House with the Sick” as “a great Subject of Discontent”. Unsurprisingly the practice prompted many complaints. Shutting up was variously thought to cause more disease, bring on a melancholy that made one more susceptible to it, obstruct trade and add to the demands on parish budgets, which now had to support both the well and the sick.

“This shutting up would breed a Plague if there were none: Infection may have killed its thousands, but shutting up hath killed its ten thousands.”
Anonymous, The Shutting Up Infected Houses as it is Practised in England Soberly Debated, 1665

What officials attempting to stop the spread of plague didn’t know at the time was that the disease is very rarely transmitted from human to human. Plague is caused by a bacterium that usually infects small rodents, including black rats. When one of these rats dies, fleas feeding on its blood will move to the next best thing – in a shut-up house, this could be the nearest human, who might otherwise have been spared.

V0020711 A black rat sitting upright on the ground. Etching by W. S.

A black rat, William Samuel Howitt, 1808.

As the number of infected people in the city grew, it became difficult to police the shutting up of houses. By the time New’s children became ill, anyone who did not comply with the plague orders could be legally restrained, though some ‘inmates’ still managed to escape. Meanwhile, many of those in the upper echelons of society simply ignored the rules. Neighbours of the Lord Mayor himself were arrested when they tried to shut up his house, which became infected in September.

Even citizens free to move around the deserted city changed their behaviour. People walked in the middle of the street to avoid contact with others, only entered shops if no other customers were present and preferred not to handle money. All this clearly affected trade. Those involved in cloth making were perhaps worst affected, since plague was thought to be transmitted by the movement of fabrics. In hindsight, there was some truth in this: fleas can survive for several weeks outside a living host and might easily have been carried around – and out of – the city within bundles of cloth.

“All Trades being stopt, Employment ceased; the Labour, and by that, the Bread of the Poor were cut off… These might be said to perish, not by the Infection it self, but by the Consequence of it; indeed, namely, by Hunger and Distress, and the Want of all Things; being without Lodging, without Money, without Friends, without Means to get their Bread.”
Daniel Defoe, A Journal of the Plague Year, 1722

In this climate, John New’s life would have been strained even without the pain of losing his children. As it was, he didn’t have to struggle through the worst days of the crisis, which came later, in August and September. New died just three days after his children, one of an estimated 100,000 people who perished before the end of October, when cooler weather arrived and this latest London epidemic subsided.

Electric Age: condenser couch

Proponents of this high-frequency electric couch claimed that it could cure everything from mental illness to kidney dysfunction.

Wehnelt Interruptor

This couch, featured in an 1899 manual of therapeutic electricity, delivered high frequency currents to the patient. Image credit: Wellcome Library.

When we visit a doctor, of body or mind, the couch is where we place ourselves at rest while the healer does their work. It may be for examination, massage or simply talking: there is perhaps no more potent symbol of medical furniture than Sigmund Freud’s couch, bedecked with richly decorated cushions and rugs. Yet the couch itself rarely plays any part in the treatment. Make way for an exception: the condenser couch, upon which one could lie and be cured by the furniture itself of ailments as varied as kidney dysfunction and painful fissures of the anus. A couch that, through the magic of electricity, did the work of the doctor.

The condenser couch was an appliance associated with early electrotherapy, a means of delivering high-frequency or ‘oscillating’ current to a patient. Rather than the continually generated alternating current we know as mains electricity, high-frequency electrotherapy relied on the controlled release of current from a condenser, a device for storing a temporary electric charge (like a Leyden jar or a modern capacitor). With the patient lying on the couch “the current passes to the patient either by a handle of bare metal held in the hand, or it may be in the form of an electrode applied upon any desired part of the body.”
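Why a discharging condenser should yield a ‘high-frequency’ current is a matter of simple circuit physics: when the stored charge is released through a coil, the current surges back and forth, oscillating at a resonant frequency set by the capacitance C and inductance L of the circuit. A rough sketch follows – the component values are illustrative assumptions, not figures drawn from the period manuals:

% Resonant frequency of a condenser of capacitance C discharging
% through a coil of inductance L (the values below are assumed):
\[
  f = \frac{1}{2\pi\sqrt{LC}}
    = \frac{1}{2\pi\sqrt{(10\,\mu\mathrm{H})(1\,\mathrm{nF})}}
    \approx 1.6\ \mathrm{MHz}
\]

A Leyden jar of around a nanofarad discharging through a small coil thus ‘rings’ at roughly a megahertz – far above the frequency of ordinary mains alternating current.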

Couch for condensation

This ‘couch for condensation’ featured in a 1913 practical handbook of medical electricity. Image credit: Wellcome Library.

This technique relied on the pioneering work of French electrotherapist Jacques-Arsène d’Arsonval, who discovered that a rapidly alternating current could deliver beneficial effects to skin and muscle tissue rather than the damage of an electric shock. Some of this may have been due to the diathermic effect of the electric current heating the tissue. But in the Electric Age the cure often pursued the ailment, and the vaunted benefits of high-frequency electric therapy ranged from localised analgesia and reduced arterial tension to diminishing “the virulence of blue pus”.
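The diathermic effect is, in modern terms, ordinary resistive heating. A worked line, using assumed round numbers rather than anything reported in the electrotherapy literature: a current I passing through tissue of resistance R dissipates power

% Joule heating in tissue (illustrative, assumed values):
\[
  P = I^{2}R, \qquad \text{e.g.}\quad (0.5\ \mathrm{A})^{2} \times 200\ \Omega = 50\ \mathrm{W}
\]

as warmth. Because nerve and muscle excitation falls off sharply at frequencies above a few kilohertz, a high-frequency current strong enough to warm tissue in this way can pass without the jolt of a shock.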

Effects on the mind were also noted. A 1913 manual of ‘Natural Therapy’ suggests that while the efficacy of high-frequency electricity was probably overstated, it might nevertheless have valuable psychological effects. There is “no sufficient reason why we should refuse to employ it in the treatment of the hysterical and neurotic who declare they feel benefited by a séance,” the authors wrote. In Vienna, Freud’s couch was already well established as a haven for hysterical neurotics by this time, but the bare and fussy wrought-iron lines seen in illustrations of the condenser couch speak of an altogether less comfortable experience.

Bergonic chair

The Bergonic chair was used in the WWI era for giving general electric treatment in psycho-neurotic cases. Image credit: Otis Historical Archives National Museum of Health and Medicine.

The condenser couch never became a well-known brand, never featured in fashionable department store windows or high-profile legal battles. The association between electricity and mental health took an altogether more violent turn in the 1930s with the introduction of electroconvulsive therapy, or ECT. The mysterious ‘Bergonic chair’, glimpsed during World War I, might form a missing link between the couch and shock therapy.

Today, diathermy remains a mainstream medical treatment. The use of high-frequency electricity is more commonly found in beauty treatments, and low-current, high-voltage violet wands sell well to the BDSM community. But if you encounter an electric couch in a medical setting these days, it’s likely to be one that adjusts easily for your height and for the convenience of the doctor.