Redefining Death Through The Ages
Technological advances and ethical questions have forced societies to constantly redraw the thin line between life and death. (Happy Día de Muertos!)
Even the ancient Greeks understood that someone who seemed dead could in fact be alive. Cutting off a finger of the deceased before proceeding to cremation was therefore a common practice.
In medieval Europe, on the other hand, accidentally burying people alive happened quite often. At that time, death was the order of the day. Populations were decimated by wars and epidemics (the 14th century ‘Black Death’ plague killed an estimated one-third of Europe’s inhabitants). On top of that, medical knowledge was still quite rudimentary.
Niccolò Massa, an Italian anatomist and author of one of the first anatomy books, published in 1536, was so afraid of being buried alive that he stipulated he should not be consigned to the grave until at least two days after his death.
Between 1500 and 1700, so-called anatomy theaters were popular entertainment, featuring shows that starred "physicians" performing public autopsies on executed prisoners. Unfortunately, the deceased sometimes turned out to be only half-dead.
The medicalization of death began in the 18th century, when doctors began to assist patients toward death, for example by administering analgesic opiates.
Cessation of respiration and circulation was long considered the ultimate sign of death. The time of death was designated as the last heartbeat heard (cor ultimum moriens, Latin for "the heart is the last to die").
In Mexico, and elsewhere, Nov. 2 is "Day of the Dead." (Tomascastelazo)
Various methods served to check breathing, such as placing a feather or the flame of a burning candle under the nose. Another famous practice consisted of holding a small mirror against the mouth or nostrils. Some placed a glass of water on the chest to see whether the surface would ripple. A more extreme method was to cut the veins to see whether the blood flowed out rhythmically.
What the brain tells us
As time passed and medical knowledge advanced, it became less clear where to mark death. Technological innovations helped to stave off the end. People learned how to "revive" the dying using artificial respiration, smelling salts or defibrillators.
At the beginning of the 20th century, physicians already recognized the phenomenon of clinical death, in which the heart and lungs stop functioning but the brain remains active. From then on, the cessation of circulation and breathing, loss of reflexes, paleness of the body or low temperature were no longer enough to determine death. The law imposed an obligation to wait 12 hours before removing the body from the place of death, and burial could take place only 24 hours after the person passed away.
Still, even into the 1960s, physicians dropped hot wax on a patient's skin to see whether it provoked any redness (which would mean the person was alive), or tied a string around a finger to see whether any swelling occurred. Injecting dyes, such as a blue fluorescent colorant, was also practiced: if the blood still circulated, the skin and even the eyeballs would become stained.
Over the last 4,000 years, we have learned that the signs of death may be misleading. Nowadays, we revive people who fifty years ago would have been considered lost. But on the other hand, a breathing person can today be declared dead.
The advent of organ transplants created a need to redefine death once again. Only living organs may be transplanted, and yet the donor must be dead. Today, we have settled on cerebral life as the fine line separating us from the grave.