The Birth of The Sanitation Movement from “Dirty Old London” to NYC.
Posted on | April 10, 2025 | No Comments
Mike Magee
I. Florence Nightingale and Sidney Herbert
Order had always been part of Florence Nightingale’s life. Her father, William Edward Shore, was a country squire who at age 21 inherited his rich uncle’s huge fortune as well as his name. On his uncle’s death, the younger (now) Nightingale seamlessly managed the profits of the family’s lead smelting business, as well as not one but two named estates – the 1,300-acre Lea Hurst in Derbyshire and the equally impressive Embley Park in Hampshire.
He and his wife had two daughters, each named after the Italian city where she was born while the family was traveling abroad. Parthenope carried the Greek name for Naples; Florence arrived on May 12, 1820, one year later and 300 miles to the north, in the “Birthplace of the Renaissance.” The family was well connected with members of Parliament, none closer than Baron Sidney Herbert. It was he to whom the family turned for reassurance and guidance when Florence declared at age 16 that her life’s work would be nursing the sick in the service of the Lord.
This came as quite a surprise to her father, who had taken special care to see that she was classically trained in Greek, Latin, French, German, philosophy, and religion. But in Florence’s words, she “craved for some regular occupation, for something worth doing instead of frittering away time on useless trifles.” To pursue it, she was willing to decline suitors and forgo the life of comfort and philanthropy enjoyed by her mother and older sister.
Sidney Herbert was willing to support the strong-willed young woman, ten years his junior, carrying her along on fact-finding trips to Egypt and beyond. None of it shook her commitment. Her intent was clear when she noted in her diary, “On February 7, 1837, God spoke to me and called me to his service.” The calling was specific – nursing the ill in institutional settings. As luck would have it, her goal aligned well with the voluntary efforts of Sidney’s wife, Lady Elizabeth Mary Herbert, a prodigious fundraiser.
Queen Victoria assumed the throne of England following her uncle’s death just four months after Florence’s religious awakening. The two women were born just twelve days short of a year apart. Both the young Queen and her husband, Albert, were military enthusiasts and saw themselves as active participants in the nation’s armed conflicts. By September 1854, they had a front-row seat to a simmering conflict between the Ottoman Empire and Russia. Britain and France had thought they had brokered a deal between the two primary combatants when the truce fell apart.
The Russian Emperor, Nicholas I, had ordered the invasion of what is now modern-day Romania in July of 1853. By January 1854, the British and French fleets had entered the Black Sea, and the war was on. Britain’s Prime Minister, Lord Palmerston, suggested the effort was preventive. As he said in words that ring true today, “The main and real object of the war is to curb the aggressive ambition of Russia.”
Military historians would later document: “The Crimean War is largely forgotten now, but its impact was momentous. It killed 900,000 combatants; introduced artillery and modern war correspondents to conflict zones; strengthened the British Empire; weakened Russia; and cast Crimea as a pawn among the great powers.”
At the time, Queen Victoria leaned heavily on her new Secretary at War, Sidney Herbert, and Florence Nightingale saw a once-in-a-lifetime opportunity for clinical experience and seized it. With Herbert’s endorsement, she and the 38 nurses in her charge arrived at the Barrack Hospital at Scutari, outside modern-day Istanbul, ill-prepared for the disaster that awaited them. Cholera, dysentery and frostbite – rather than battle wounds – were rampant in the cold, damp, and filthy halls.
During that first winter, 42% of her patients perished, leaving over 4,000 dead, the vast majority without any battle wounds. Florence later described her work settings as “slaughter houses.” The enemy wasn’t bullets or bayonets, but disease – typhus, cholera and typhoid fever. Over 16,000 British soldiers died, 13,000 of them from disease.
Nightingale’s initial assessment was that warmer clothing and food would stem the tide. Going over the heads of medical leadership, she did what she could, and leaked details of what she was observing to Herbert and to journalists who, for the first time, were stationed within the war zone. In the process, she became a celebrity in her own right, and as the spring of 1855 approached, a first-ever Sanitary Commission was sent to the war zone, and Victoria and Albert themselves, with two of their children, visited the wounded.
Famed artist Jerry Barrett made hasty sketches of what would become The Mission of Mercy: Florence Nightingale, which hangs to this day in the National Portrait Gallery in London. During this same period, the first lines of Henry Wadsworth Longfellow’s poem, Santa Filomena, would take shape, including “Lo! In that house of misery, A lady with a lamp I see, Pass through the glimmering gloom, And flit from room to room.” And the legend of Nightingale, the actual “Lady of the Lamp,” appeared on the front page of the Illustrated London News complete with etched images.
It read: “She is a ministering angel without any exaggeration in these hospitals, and as her slender form glides gently along each corridor, every poor fellow’s face softens with gratitude at the sight of her.”
What became clear to Florence and others was that infection and lack of sanitation were the culprits, and corrective actions on the facilities themselves, along with sanitary practices that Nightingale led, caused the subsequent mortality rates to drop to 2%. By the time the war drew to a halt in February, 1856, 900,000 men had died. Florence Nightingale remained for four more months, arriving home without fanfare on July 15, 1856. Thanks to the first ever war correspondents, she was now the 2nd most famous woman in Britain, after Queen Victoria.
While she was away, the Herberts raised money from rich friends; the Nightingale Fund now held 44,000 pounds in reserve. These funds would help support a hospital training school and her famous book, “Notes on Nursing,” in future years. Though her re-entry was quiet and reserved, she had plenty to say, and committed most of it to writing. While in Crimea, she had written to Britain’s top statistician, Dr. William Farr. He had replied, “Dear Miss Nightingale. I have read with much profit your admirable observations. It is like light shining in a dark place. You must when you have completed your task – give some preliminary explanation – for the sake of the ignorant reader.” Shortly after her return, they met. Working closely with Farr, she documented in dramatic form the deadly toll in Crimea and tied it to disease and lack of sanitation in “Notes on Matters Affecting the Health, Efficiency, and Hospital Administration of the British Army,” which she self-published and aggressively distributed.
Illustrated with polar area (“coxcomb”) diagrams divided into 12 sectors, each one representing a month, the report graphically tied improved sanitation to plummeting death rates. Understanding their long-term value, she carefully approved the paper, ink, and process that have allowed these images to remain vibrant a century and a half later. As she said later with some cynicism, they were designed “to affect thro’ the Eyes what we may fail to convey to the brains of the public through their word-proof ears.” In 1858, she became the first woman to be made a fellow of the Royal Statistical Society.
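For readers curious about the mechanics of her invention, here is a minimal modern sketch of a polar area (“coxcomb”) chart in Python with matplotlib. The twelve monthly figures are invented purely for illustration; they are not Nightingale’s actual Crimean data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative (made-up) monthly deaths per 1,000 soldiers -- NOT Nightingale's actual figures.
months = ["Apr", "May", "Jun", "Jul", "Aug", "Sep",
          "Oct", "Nov", "Dec", "Jan", "Feb", "Mar"]
deaths = [25, 40, 65, 100, 150, 180, 210, 250, 290, 310, 240, 160]

# One wedge per month. In a polar area chart the wedge AREA (not its radius)
# is proportional to the value, so the radius is the square root of the value.
angles = np.linspace(0.0, 2 * np.pi, len(months), endpoint=False)
width = 2 * np.pi / len(months)
radii = np.sqrt(deaths)

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.bar(angles, radii, width=width, align="edge", color="steelblue", edgecolor="white")
ax.set_xticks(angles)
ax.set_xticklabels(months)
ax.set_yticklabels([])  # radial ticks lose meaning once values are sqrt-scaled
ax.set_title("Polar area ('coxcomb') chart, in the style of Nightingale")
plt.show()
```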
In that first year after her return, she was described as “a one woman pressure group and think tank…using statistics to understand how the world worked was to understand the mind of God.” In 1860, she published her “Notes on Nursing,” selling 15,000 copies in the first two months. It purposefully championed sanitation (“the proper use of fresh air, light, warmth, cleanliness, quiet, and the proper selection and administration of diet”) and promoted cleanliness as a path to godliness. It targeted “everywoman” while launching professional nursing.
II. Edwin Chadwick
Florence Nightingale was not the originator of the Sanitary Movement in “Dirty Old London.” That honor goes to one Edwin Chadwick, a barrister who, in 1842, published his “Report on the Sanitary Condition of the Labouring Population of Great Britain.” As historians and commentators have noted, there was a good case to be made for cleanliness. Here are a few of those remarks:
“The social conditions that Chadwick laid bare mapped perfectly onto the geography of epidemic disease…filth caused poverty, and not the reverse.”
“Filthy living conditions and ill health demoralized workers, leading them to seek solace and escape at the alehouse. There they spent their wages, neglected their families, abandoned church, and descended into lives of recklessness, improvidence, and vice.”
“The results were poverty and social tensions…Cleanliness, therefore, exercised a civilizing, even Christianizing, function.”
Chadwick was born on January 24, 1800. His mother died when he was an infant, and his father was a liberal politician and educator. His grandfather was a close confidant of the Methodist theologian John Wesley. Early in his life, young Chadwick pursued a career of his own in law and social reform. A skilled writer, he published an early essay in the Westminster Review titled “Applied Science and its Place in Democracy.” By the time he was 32, he had focused all of his expertise on social engineering – especially public health, with an emphasis on sanitation.
But the Sanitary Movement required population-wide participation, structural change, new technology, and effective storytelling. By then, William Harvey’s 1628 description of the human circulatory system, complete with pump, outflow channels, and venous return channels, was well understood. Seizing on the analogy, Chadwick, along with the city engineers of the day, imagined an underground highway of pipes to and from every building and home, whose branches connected to new sanitary devices.
One of those engineers was George Jennings, the owner of South Western Pottery in Dorset, a maker of water closets and sanitary pipes. He was something of a legend in his own time, and recipient of the 1847 Medal of the Society of Arts, presented by none other than Prince Albert himself.
When Jennings introduced his flush toilet as a replacement for soil pots – hand-transported each morning and emptied into an outhouse if you were lucky, or into the streets if not – its future was anything but assured. But, by good luck, the Great Exhibition (the premier display of futuristic visionaries) was scheduled for London in 1851. The Crystal Palace exhibition was the show stopper, attracting a wide range of imagineers. Jennings’ “Monkey Closet” was the hit of the show. His patent application followed the next year and read: “Patent dated 23 August 1852. JOSIAH GEORGE JENNINGS, of Great Charlotte-street, Blackfriars-road, brass founder. For improvements in water-closets, in traps and valves, and in pumps.”
Modernity had arrived. And no one was more enthusiastic than Thomas Crapper, then a 15-year-old dreamer. Within three decades he held nine toilet patents, including one for the U-bend, an improvement on the S-bend, and an 1880 Royal commission granted to Thomas Crapper & Co. to install thirty toilets (enclosed and outfitted with cedar wood seats) in the newly purchased Sandringham House, the royal family’s Norfolk country seat. His reputation lives on thanks to a diminutive term, used with the greatest guttural emphasis by the Scots – CRAP. The company is also still in existence, now selling luxury models of the original design.
Sanitary engineering combined with Nightingale’s emphasis on fastidious housekeeping, and cleanliness in hospitals, enforced by nurses with religious zeal, would change the world. And they did. But those same changes would take decades to reach crowded immigrant entry points in locations like New York City.
III. The Horse and Swill Milk
One historian described it this way: “As New York City ascended from a small seaport to an international city in the 1800s, it underwent severe growing pains. Filth, disease, and disorder ravaged the city to a degree that would horrify even the most jaded modern urban developer.”
One of the prime offenders was the noble work horse. By 1900, on the eve of the wholesale arrival of motor cars, there were roughly 200,000 horses in New York City, carrying and transporting humans and products of every size and shape, day and night, along the city’s worn-down, narrow cobblestone roads and alleyways.
It was a hard life for the horse, whose working lifespan averaged only 2 1/2 years. They were literally “worked to death.” In the late 1800s, as many as 15,000 dead horses were carted away in a single year. Often they were left to rot in the street because they were too heavy to transport. When they weren’t dying, the horses were producing manure – a startling 5 million pounds dumped on city streets each day.
As for human waste, sewer construction didn’t begin in New York until 1849, in response to a major cholera outbreak. Clean water had arrived seven years earlier with the completion of the Croton Aqueduct, carrying water south from Westchester County. This was augmented with rooftop water tanks beginning in 1880. By 1902, most of the city had sewer service, including the majority of the tenement houses. The Tenement Act of 1901 had required that each unit have at least one “water closet.”
As for the horses, the arrival of automobiles all but eliminated the “horse problem” overnight. Not so for cows, or more specifically the disease-laden “swill milk” they produced. Dairy suppliers north of the city struggled to keep up with demand in the late 1800s. To lower production costs, they fed their cows the cast-off “swill” of local alcohol distilleries. The result was infection and a range of diseases in a bargain-basement beverage sold primarily to at-risk parents and consumed by their children.
Swill milk was the chief culprit in soaring infant mortality in New York City between 1880 and 1900. Annually there were some 150,000 cases of diphtheria, resulting in 15,000 deaths a year. A Swiss scientist, Edwin Klebs, identified the causative bacterium, Corynebacterium diphtheriae, in 1883. A decade later, a German scientist, Emil von Behring, dubbed the “Saviour of Children,” developed a diphtheria antitoxin and was awarded the Nobel Prize in 1901 for the achievement.
The casualties were primarily infants, whose mortality rate in NYC at the time was 240 deaths per 1,000 live births. Many of these deaths would be traced back to milk infected with TB, typhoid, Strep-induced Scarlet Fever, and Diphtheria.
The process of heating liquid to purify it, or pasteurization, was discovered by Louis Pasteur in 1856, not with milk but with wine. Its use on a broad scale to purify milk first gained serious traction in 1878, after Harper’s Weekly published an exposé on “Swill Milk.” But major producers and distributors resisted regulation until 1913, when a massive typhoid epidemic from infected milk killed thousands of New York infants. Even so, diphtheria remained the most feared killer of infants.
As Paul DeKruif wrote in his 1926 book, The Microbe Hunters, “The wards of the hospitals for sick children were melancholy with a forlorn wailing; there were gurgling coughs foretelling suffocation; on the sad rows of narrow beds were white pillows framing small faces blue with the strangling grip of an unknown hand.”
One such victim was the only child of two physicians, Abraham and Mary Putnam Jacobi, whose 7-year-old son, Ernst, was claimed by the disease in 1883. Working with philanthropist Nathan Straus, the Jacobis established pasteurized milk stations in the city, which coincided with a 70% decline in infant mortality from diphtheria, tuberculosis, and a range of other infectious diseases.
By 1902, the horse’s hero status had been reclaimed as it became the source of diphtheria and tetanus antitoxins. The bacteria were injected into horses, and after a number of passes, serum collected from the horses was laden with protective antitoxins, relatively safe for human use. In 1901 alone, New York City purchased and delivered 25,000 doses of antitoxin, funded by the Red Cross and the Metropolitan Life Insurance Company.
Tags: crimean war > edwin chadwick > florence nightingale > george jennings > jacobi > monkey closet > nicholasI > nightingale fund > queen victoria > santa filomena > sidney herbert > straus > the microbe hunters > thomas crapper > william edward shore > william farr
“Think Microscopically” – The Birth of Cell Theory.
Posted on | April 8, 2025 | 2 Comments
Mike Magee
If there was an All-Star team for 20th Century Medicine, two members of the roster would likely be William Welch and William Osler, two of the “Big Four” founders of the Johns Hopkins School of Medicine. (The other two were surgeon William Stewart Halsted and obstetrician Howard Atwood Kelly.) Welch served as the first Dean of the school and Osler, born and bred in Canada, arrived at Johns Hopkins at the age of 40, and birthed the first residency training program at the school.
Prior to their arrival in Baltimore, the two shared a common medical origin story. They both were trained in pathology and cell theory by the famed German physician, Rudolph Virchow. He encouraged them both to embrace “attention to detail” by “thinking microscopically.”
Virchow is remembered for a famous phrase – “omnis cellula e cellula” (every cell stems from another cell). That may not sound too radical today. But back in 1855, it was revolutionary. Cells, as an entity, had been around for a while. Two centuries earlier, in 1665, English scientist Robert Hooke, while observing a dead specimen of cork under a microscope, noted that the repetitive compartments reminded him of the rows of monks’ rooms, or cellulae, in a monastery. When he published that impression in Micrographia that year, the label “cell” was born.
He was not the first to use a microscope. That honor remains contested. But it is known that the first use of a compound microscope (an instrument that married an eyepiece with an objective lens positioned near the specimen) dates to at least 1619. A half century later, in 1674, the Dutch scientist Anton van Leeuwenhoek first described living algae and bacterial cells.
Over the next two centuries, cells were seen here, there, and everywhere, without many conclusions drawn. But in 1838, two German scientists – zoologist Theodor Schwann and botanist Matthias Schleiden – noted the similarities of plant and animal cells in a wide range of observed tissues. A year later, Schwann published a book claiming that: 1) the cell is the fundamental unit of structure and function in living things, and 2) all organisms are made up of one or more cells. In pushing his third insight – that new cells emerged from an original cell through a process of “spontaneous generation,” like crystal formation – he found himself a bit over his skis. But Virchow happily corrected him 17 years later, in 1855, with his “omnis cellula e cellula.”
With this insight, Virchow launched the field of cellular pathology. How exactly cells manage to divide and create identical copies of themselves remained to be determined. But he did figure out, before nearly all other scientists, that diseases must involve distortions or changes in cells. From this he deduced that diagnosis and ultimately treatment could now be guided not simply by symptoms or external findings, but by cellular pathologic changes as well. And this was more than theory. In fact, Virchow is credited with first describing the microscopic findings of leukemia back in 1847.
The first description of a cell nucleus was made by a Scottish botanist, Robert Brown in 1833. Over the next half-century, cell scientists busily described various cell organelles without a clear understanding of what they did. What was clear with light microscopy was that cells were bounded by a cellular membrane.
Most of the attention in the second half of the 19th century was on the nucleus, its division, and cell replication. In 1874, the German biologist Walther Flemming first described mitosis in detail. But it wasn’t until 1924 that the German biologist Robert Feulgen, “following experiments using basic stains such as haematoxylin, defined chromatin as ‘the substance in the cell nucleus which takes up color during nuclear staining.’” To this day, the Feulgen reaction “still exerts an important influence in current histochemical studies.”
Watson & Crick’s description of the DNA double helix was still far in the distance. But in the mean time, other cell organelles were visualized and named like the Golgi apparatus named in 1898 after Italian biologist Camillo Golgi who also used heavy metal stains (silver nitrate or osmium tetroxide) to aid visualization. Mitochondria, like the Golgi apparatus, stretched the limits of light microscopic visualization. But even without visualization, scientists by the 1930’s were beginning to deduce the functions of organelles they could barely see, and a few (like lysosomes) that they had never seen but knew had to be there.
Popularization of the electron microscope (if not its invention) is generally credited to two German researchers, Max Knoll and Ernst Ruska, who in 1931 used two magnetic lenses to focus a beam of electrons and achieve much higher magnifications. Ruska shared the 1986 Nobel Prize in Physics for the work. Breakthroughs began to roll out almost immediately. High-resolution pictures of mitochondria appeared in 1952, followed by the Golgi apparatus in 1954. The inner workings of the cells displayed movement of vesicles across the membrane and from nucleus to cytoplasm, with structures constantly being constructed and deconstructed. And visualization only got better in 1965, when the first commercial scanning electron microscope appeared.
Cells vary enormously in size. The smallest free-living cellular organism is the mycoplasma bacterium. It lacks a cell wall (important since many antibiotics work against bacteria by disrupting their cell walls) and is 400,000 times smaller than a human cell. A full-grown human organism includes some 30 trillion cells. Each cell is remarkably complex, holding about 10,000 different proteins, but possessing enough directions within its DNA manual to produce up to 100,000 protein varieties. The extra 90,000 are only employed in “specialized cells.” Order is maintained inside the cell by membrane sub-compartmentalization. Specialization is reflected in the choice of macromolecules that are allied by function.
Keeping those 30-trillion-cell human collections alive was part of my learning curve as a surgical resident at the University of North Carolina from 1973 to 1978. Our Chief of the Trauma Department was a Vietnam veteran, Dr. Herb Proctor. At that Level 1 trauma facility, the sound of helicopters overhead was a constant, and arriving victims often needed fluids fast. One of the intravenous fluids of choice was “Ringer’s lactate,” a solution that included sodium chloride, potassium chloride, calcium chloride and sodium bicarbonate.
The life-saving formula was created by a British physician named Sydney Ringer in 1882. He came up with it while experimenting on how to keep a frog heart (removed from the frog) beating while suspended in solution. Three years later, a budding German embryologist, Wilhelm Roux, who was fixated on “showing Darwinian processes on a cellular level,” was able to keep cells he had extracted from a chicken embryo alive for 13 days. With that, the discipline called cell culture or tissue culture was off and running.
In modern usage, the term refers to growing cells from a multicellular organism outside the body, or in vitro. Bathed in a special culture medium, naturally or deliberately mutated cells can grow and continue to divide indefinitely, creating an immortalized cell line. In the early years, these cell lines were often contaminated by other types of cells, or more commonly by infectious organisms. The discovery of antibiotics by Alexander Fleming in 1928 greatly improved the reliability of these cell lines for scientific experimentation.
With the introduction of living cell cultures, and the use of the electron microscope, the inner workings of the cell were revealed. Simultaneously, the field of biochemistry matured, alongside the miracle of genetics. Side by side, in direct view, came fertilization, embryonic development, multi-potential stem cells with timed specialization, and organ development. Watson and Crick’s description of the DNA double helix in 1953 (building on the work of Rosalind Franklin and Maurice Wilkins) opened the door, a half century later, to the sequencing of the human genome in 2003, after a 13-year race to the finish line by competitors, then collaborators: the NIH’s Francis Collins and Celera Corporation CEO J. Craig Venter.
As importantly, the continued mining of cell theory and the evolution of tissue culture exploded progress in cancer research and unlocked the mysteries of immunology, the workings of virology, and the creation of a range of life-saving vaccines, from polio to the mRNA vaccines for Covid, and much, much more.
Tags: anton van leewenhoek > canillo golgi > cell culture > cell theory > cells > ernst ruska > herb proctor > immortalized cell lines > in vitro > matthias schleiden > max knoll > ringers lactate > robert feulgen > robert Hooke > rudolph virchow > theodor schwann > tissue culture > walter flemming > william osler > william welch
The Most Profitable Convenience Store Product? Caffeinated, but not coffee.
Posted on | April 7, 2025 | No Comments
Mike Magee
If you enter a Starbucks or Dunkin’ Donuts on any given day, it is more than likely that you will exit with roughly half the maximum recommended daily dosage of caffeine in a hot or cold coffee beverage. No surprise here.
But you may be surprised to learn that your caffeine hit is likely to be just as high (or higher) if you visit a convenience market like 7-Eleven or Cumberland Farms – but not from a cup of coffee. That’s because one third of all beverage sales in these outlets comes in the form of caffeine-super-charged energy drinks. More on that in a moment.
Convenience stores, often associated with easy-access gas pumps, are ubiquitous in most parts of our nation. Take the tiny state of Rhode Island. At last tally, it had 617 convenience stores, an increase of 1 1/4% over the previous year. 279 of these stores (45%) are single-owner operations.
Packaged beverages account for roughly 16% of sales. That’s second only to cigarette sales. And when it comes to profitability, beverages are #1, contributing a whopping 20% of gross dollar sales. And among all beverages, the leader of the pack in profit margins is – you guessed it – energy drinks at 44%.
Total U.S. sales of energy drinks in 2024 were nearly 22 billion dollars. These beverages come in a variety of sizes, are usually canned and refrigerated, and are packed with extras. For example, one brand (Celsius) includes the following additives: citric acid, taurine (an amino acid), green tea extract, ascorbic acid (Vitamin C), the artificial sweetener sucralose, the metabolite glucuronolactone, ginger root extract, and of course a range of vitamins and artificial flavors.
To their young fans, taste is secondary to increased attention and reaction speed, which are due to the heavy dose of caffeine. Other promised effects, like increased muscle strength and endurance, are not supported by data. What has been well documented is that exceeding a daily dose of 400 mg of caffeine can cause problems, including cardiac irregularities, anxiety and panic disorders, and GI disturbances.
There are currently no federal age requirements for the purchase of energy drinks.
Tags: bang > caffeine > celsius > Energy Drinks > monster > performance drinks convenience stores > red bull > rockstar > teen health
The World’s Most Widely Used Psychoactive Drug?
Posted on | April 2, 2025 | Comments Off on The World’s Most Widely Used Psychoactive Drug?
Mike Magee
Question: What is the world’s most widely used psychoactive drug?
Answer: Caffeine
In the U.S., caffeine is consumed mainly in the form of coffee, tea, and cola. But coffee dominates. Worldwide, humans consume over 10 million tons of coffee beans a year. Roughly 16% (1.62 million tons) is devoured by Americans. The daily intake of caffeine varies depending on the type of beverage and brand.
On average, each American consumes approximately 164 mg of caffeine each day. That’s roughly one small cup of Dunkin’ coffee or three and a half 12-ounce Diet Cokes (Trump reportedly consumes at least 12 cans of Diet Coke a day).
Across the globe, daily consumption of caffeine is close to universal. Eight in 10 humans consume a caffeinated beverage daily. That makes this chemical the “most commonly consumed psychoactive substance globally.” Its popularity is related to its ability to deliver three useful physiological enhancements – wakefulness, motor performance, and cognition.
Chemically, caffeine is a close cousin of adenosine, which is present in brain neurons. Adenosine builds up in the synaptic connections between brain neurons. When it binds to special receptors, it activates neurons that promote sleepiness. Ingested caffeine is water and lipid soluble, and is therefore able to traverse the blood-brain barrier. Once inside, its chemical structure mimics that of adenosine, and it occupies adenosine receptors because it shares approximately the same shape and size. When these receptors are occupied by caffeine, adenosine molecules are unable to activate them. The net effect is wakefulness.
Caffeine passes through the walls of the small intestine and is absorbed within 45 minutes of ingestion. From there, it is distributed to all bodily cells, reaching its highest concentrations within 1-2 hours. The average time required to eliminate half of a caffeine dose (the half-life) is 3 to 7 hours. Thereafter it is broken down in the liver.
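As a rough illustration of what a 3-to-7-hour half-life means in practice, the short sketch below applies the standard first-order (exponential) elimination model to the 164 mg average daily intake cited above. The 5-hour half-life is an assumed midpoint of the quoted range, not a measured value.

```python
# Rough sketch: caffeine elimination modeled as simple first-order (exponential) decay.
# The 5-hour half-life is an assumed midpoint of the 3-7 hour range quoted above.
dose_mg = 164.0       # average daily U.S. intake cited above
half_life_h = 5.0     # assumption; true values run roughly 3 to 7 hours per person

def caffeine_remaining(dose_mg: float, hours: float, half_life_h: float) -> float:
    """Caffeine left in the body `hours` after ingestion, assuming first-order elimination."""
    return dose_mg * 0.5 ** (hours / half_life_h)

for hours in (0, 3, 6, 9, 12):
    print(f"{hours:2d} h after ingestion: {caffeine_remaining(dose_mg, hours, half_life_h):6.1f} mg")
```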
Over 30 plant species naturally produce caffeine. The most common sources are the seeds, or beans, of two coffee plants (Coffea arabica and Coffea canephora), the leaves of tea plants, the seeds of the cocoa plant (Theobroma cacao) used in chocolate production, and kola nuts (used to produce cola beverages).
For chocolate lovers, caffeine levels depend on the product. A 4-ounce bar of dark chocolate has approximately 80 mg of caffeine. Nighttime indulgers may do better with milk chocolate, which contains 24 mg in 4 ounces.
Pure solid caffeine is bitter, odorless, and melts at 235 degrees C. The 60’s generation was familiar with various tablet forms like No-Doz (Bristol-Myers Squibb) and Vivarin (SmithKline Beecham). Each tablet contained 100 mg of caffeine. The U.S. market is estimated at $60 million annually. The three top consumer markets are college students (for “all-nighters”), truck drivers, and body builders.
None of this is especially breaking news. The restorative powers of boiling tea leaves were first documented around 3000 BC. The cocoa bean was harvested by the Mayans as early as 600 BC. Coffee use is more recent, with first accounts in the Middle East in the 15th century. Three centuries later, French chemists isolated the active ingredient, with the term caféine first appearing in the French scientific literature in 1822.
Back in 1911, Trump may have run into a problem ingesting 10 Cokes a day. Public officials viewed the product (and its stimulants) with suspicion. In fact, their seizure of 20 kegs of Coca-Cola syrup in Chattanooga, TN, led to the landmark case, United States v. Forty Barrels and Twenty Kegs of Coca-Cola. The company prevailed, only to have the U.S. Congress pass a law the following year requiring the company to include the phrase “habit-forming” on its label.
These days, caffeine consumption varies with age and sex – “2 mg/kg/day in children, 2.4 mg/kg/day in women, and 2.0 mg/kg/day in men.” As for caffeine powders and tablets, they remain unregulated. Reading between the lines, experts are preaching caution a bit more often, as in this government summary from 2017:
“When taken together, the literature reviewed here suggests that ingested caffeine is relatively safe at doses typically found in commercially available foods and beverages. There are some trends in caffeine consumption, such as alcohol-mixed energy drinks, that may increase risk of harm. There are also some populations, such as pregnant women, children, and individuals with mental illness, who may also be considered vulnerable for harmful effects of caffeine. Excess caffeine consumption is increasingly being recognized by health-care professionals and by regulatory agencies as potentially harmful.”
Tags: adenosine > cafeine > caffeine > chocolate > coffee > Coke > green vs. black tea > No-Doz > tea > Vivarin > wakefulness
Vital Signs Are Vital: How We Learned To Measure Blood Pressure.
Posted on | April 1, 2025 | Comments Off on Vital Signs Are Vital: How We Learned To Measure Blood Pressure.
Mike Magee
It has been estimated that a medical student learns approximately 15,000 new words during the four years of training. One of those words is sphygmomanometer, the fancy term for a blood pressure monitor. The word is derived from the Greek σφυγμός sphygmos (“pulse”), plus the scientific term manometer (from the French manomètre).
While medical students are quick to memorize and learn to use the words and tools of their trade, few fully appreciate the centuries-long accumulation of incremental insights, discoveries, and engineering feats behind them.
Most students are familiar with the name William Harvey. Without modern tools, he deduced from inference rather than direct observation that blood was pumped by a four chamber heart through a “double circulation system” directed first to the lungs and back via a “closed system” and then out again to the brain and bodily organs. In 1628, he published all of the above in an epic volume, De Motu Cordis.
Far fewer know much about Stephen Hales, who in 1733, at the age of 56, is credited with discovering the concept of “blood pressure.” A century later, the German physiologist Johannes Müller boldly proclaimed that Hales’ “discovery of the blood pressure was more important than the (Harvey) discovery of blood.”
Modern day cardiologists seem to agree. Back in 2014, the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure reported that “With every 20 mm Hg increase in systolic or 10 mm Hg increase in diastolic blood pressure, there is a doubling risk of mortality from both ischemic heart disease and stroke.”
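Read literally, that statement describes a log-linear relationship: each 20 mm Hg systolic (or 10 mm Hg diastolic) step above a reference pressure doubles the risk. The sketch below is one illustrative way to express that rule; the 115/75 mm Hg reference point is an assumption added here for the example, not part of the committee’s quoted wording.

```python
# Illustrative reading of "risk doubles for every 20 mm Hg systolic / 10 mm Hg diastolic."
# The 115/75 mm Hg reference is an assumption for this example, not from the quoted report.

def relative_risk(systolic: float, diastolic: float,
                  ref_sys: float = 115.0, ref_dia: float = 75.0) -> float:
    """Relative mortality risk versus the reference pressure, per the doubling rule."""
    doublings = max((systolic - ref_sys) / 20.0, (diastolic - ref_dia) / 10.0)
    return 2.0 ** doublings

print(relative_risk(135, 85))   # one 20/10 step above reference -> ~2x
print(relative_risk(155, 95))   # two steps above reference      -> ~4x
```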
But comparisons are toxic. No need to diminish Harvey, who correctly estimated human blood volume (10 pints, or about 5 liters), the number of heart contractions, the amount of blood ejected with each beat, and the fact that blood was continuously recirculated – and did all this 400 years ago. But how to measure that function, and connect the measurements to a clinically significant condition like hypertension, is a remarkable tale that spanned two centuries and required international scientific cooperation.
Harvey was born in 1578 and died in 1657, twenty years before the birth of his fellow Englishman, Stephen Hales. Hales was a clergyman whose obsessive and intrusive fascination with probing the natural sciences drew sarcasm and criticism from the likes of the classical scholar and sometime friend Thomas Twining, who penned a memorable insult-laced poem in Hales’ honor titled “The Boat of Hales.”
“Green Teddington’s serene retreat
For Philosophic studies meet,
Where the good Pastor Stephen Hales
Weighed moisture in a pair of scales,
To lingering death put Mares and Dogs,
And stripped the Skins from living Frogs,
Nature, he loved, her Works intent
To search or sometimes to torment.”
The torment line may be well justified in light of Hales’ own 1733 account of the historic first-ever measurement of arterial blood pressure, illustrated and self-described here:
“In December I caused a mare to be tied down alive on her back; she was fourteen hands high, and about fourteen years of age; had a fistula of her withers, was neither very lean nor yet lusty; having laid open the left crural artery about three inches from her belly, I inserted into it a brass pipe whose bore was one sixth of an inch in diameter … I fixed a glass tube of nearly the same diameter which was nine feet in length: then untying the ligature of the artery, the blood rose in the tube 8 feet 3 inches perpendicular above the level of the left ventricle of the heart; … when it was at its full height it would rise and fall at and after each pulse 2, 3, or 4 inches.”
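To put Hales’ 8-foot-3-inch column of blood into the mm Hg units used today, it is enough to rescale the column height by the ratio of blood’s density to mercury’s. The quick sketch below does that arithmetic; the density figures are standard textbook approximations, not values taken from Hales’ account.

```python
# Convert Hales' 8 ft 3 in column of blood into modern mm Hg units.
# Density figures are standard approximations, not values from Hales' own account.
column_in = 8 * 12 + 3              # 99 inches of blood
column_mm = column_in * 25.4        # about 2,515 mm of blood

density_blood = 1.05                # g/cm^3, approximate
density_mercury = 13.6              # g/cm^3

pressure_mmHg = column_mm * density_blood / density_mercury
print(f"{pressure_mmHg:.0f} mm Hg")  # roughly 195 mm Hg -- a plausible systolic peak for a horse
```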
With the existence of “blood pressure” established, the world would wait nearly another century for a reliable measurement tool. That advance came from the hands of the French physician-physicist Jean Léonard Marie Poiseuille, born in 1799 in the aftermath of the French Revolution. In 1828, as a doctoral candidate, he showed that a mercury manometer, attached to a cannula laced with anticoagulant and inserted into lab-animal vessels as small as 2 mm in diameter, yielded measurable and reproducible arterial pressure readings – work that earned him a gold medal from the Royal Academy of Medicine.
Carl Ludwig, a 31-year-old German professor of physiology, next decided that Poiseuille’s readings needed a permanent and transportable record. His solution, in 1847, was to attach a float with a writing pen to the open mercury column. As the mercury rose, the pen scratched out a tracing on a revolving smoked drum.
But direct arterial puncture was impractical and invasive. By 1855, scientists had surmised that applying external counter-pressure to an artery could obliterate the pulse below the obstruction, and that measuring the pressure generated by an obstructing external rubber ball would essentially reveal the blood pressure generated by a contracting heart – the systolic pressure.
In 1881, an Austrian physician named Karl Samuel Ritter von Basch created an elaborate portable machine that included a manometer capable of measuring the internal water pressure inside an inflatable rubber ball applied at the wrist to the radial artery. The pressure necessary to eliminate the pulse below was roughly the peak pressure of the column of blood when the heart contracted. Eight years later, the French physician, Pierre Carle Édouard Potain, replaced the water with air for compression.
By 1896, blood flow was appreciated as a series of waves that peaked when the heart contracted and fell as the heart relaxed. The wrist-compressing rubber cup was replaced by an air-filled cuff wrapped around the upper arm, which constricted the larger brachial artery. A Russian surgeon, N.C. Korotkoff, suggested in 1905 that doctors listen for the waves rather than feel for the pulse. The sounds he described became known as Korotkoff sounds.
As described in a 1941 translation of the Russian paper and illustrated by Wikipedia, Korotkoff wrote, “On the basis of this observation, the speaker came to the conclusion that a perfectly constricted artery under normal conditions, does not emit any sounds…The sleeve is put on the middle third of the arm; the pressure in this sleeve rises rapidly until the circulation below this sleeve stops completely. At first there are no sounds whatsoever. As the mercury in the manometer drops to a certain height, there appears the first short or faint tones, the appearance of which indicates that part of the pulse wave of the blood stream has passed under the sleeve…Finally all sounds disappear. The time of disappearance of the sounds indicated the free passage or flow of the blood stream… Consequently, the reading of the manometer at this time corresponds to the minimum (diastolic) blood pressure.”
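In procedural terms, Korotkoff’s method amounts to deflating the cuff slowly and noting the cuff pressure at the first audible sound (systolic) and at the pressure where the sounds disappear (diastolic). The toy sketch below mimics that logic on invented readings; a real monitor detects the sounds acoustically or oscillometrically rather than being handed a ready-made list.

```python
# Toy version of the auscultatory method. `readings` lists (cuff_pressure_mmHg, sound_heard)
# pairs in deflation order. The data are invented for illustration only.

def read_blood_pressure(readings):
    systolic = diastolic = None
    for pressure, heard in readings:
        if heard and systolic is None:
            systolic = pressure          # first audible Korotkoff sound
        if not heard and systolic is not None and diastolic is None:
            diastolic = pressure         # sounds have just disappeared
    return systolic, diastolic

deflation = [(180, False), (160, False), (142, True), (130, True),
             (110, True), (92, True), (84, False), (70, False)]
print(read_blood_pressure(deflation))    # -> (142, 84)
```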
It is easy to forget, in an age of semiconductors, photocells, and strain gauges, that progress in understanding the human circulatory system took centuries of work and international cooperation. When Covid hit, households that could afford them acquired home blood pressure monitors and pulse oximeters that attach to an index finger and deliver blood oxygen saturation and pulse with no delay. For a little more, you can access a portable ECG monitor in the comfort of your own home.
We appear to have entered a new era, one in which a U.S. president and his enablers are fast at work dismantling American scientific capacity and cooperative, AI-laced international discovery. The timing couldn’t be worse, since increasingly we patients are expected to enthusiastically participate as both providers and recipients of our own health care.
Tags: blood pressure > blood pressure monitor > ECG monitor > Jean Leonard Marie Poiseuille > Karl Samuel Ritter von Basch > Korotkoff sounds > N.C. Korotkoff > Pierre Carle Edouard Potain > pulse oximeter > sphygmomanometer > Stephen Hales > william harvey
Do You Know Your Blood Type?
Posted on | March 24, 2025 | 2 Comments
Mike Magee
Medical Science has made remarkable progress over the past 100 years, fueled by basic scientific discoveries, advances in medical technology, improved diagnostic testing, and public health programming to support, inform, and empower patients.
Progress has been sequential, with each new discovery and advance building on those preceding it. These have combined to lengthen lifespan in the U.S. by 70% since 1900. If one were to make a list of the top 10 medical advances in the 20th century, there would be a wealth of candidates, and great debate over which to include. But one candidate for certain would be safe and effective blood transfusions.
In the middle of the 20th century, most Americans knew their blood type. It was one of the earliest pieces of medical data shared with patients. Some of us may still have it enshrined on a Red Cross donor card, or remember learning it in preparation for surgery, as part of obstetrical care, or during hospitalization.
Nowadays, we know a great deal more, data-wise, about ourselves and can refresh our memories by accessing electronic health records. But, surprisingly, blood type is often absent.
Both my wife and I once knew our blood type, but were no longer certain. And blood type is not something you want to get partly right. For victims of major trauma, obstetrical patients, patients undergoing chemotherapy, and aging patients with chronic anemia, blood transfusions remain common.
Unlike in the past, you don’t need a doctor’s appointment to learn your blood type. My wife and I purchased a self-testing double kit on Amazon for $18.99. My result, A+, is enshrined on the home test card above, delivered in 10 painless minutes. How this was made possible, however, is part of a complex history that reaches back more than four centuries.
Over 7 million Americans donate blood each year. Worldwide, the number rises to some 120 million donors annually. Together, Americans contribute over 12 million units of blood each year. Every two seconds, an American receives a blood transfusion in a hospital, outpatient care unit, or home setting. Each day, 30,000 units of whole blood or packed red blood cells, 6,000 units of platelets, and 6,000 units of plasma are transfused.
For hospital patients over 64, blood transfusion is the second most common hospital procedure. 16% of recipients are in critical care units, 11% in surgical operative suites, 13% in emergency departments, 13% in outpatient units (most in the service of cancer patients), and 1.5% associated with OB-Gyn procedures including 1 out of every 83 deliveries.
Hemorrhage is the most common cause of death in 1 to 40 year olds. In the first hour after arrival at a trauma center, 25% require a transfusion and 3% of victims require more than 10 units of blood. Gun violence is the largest net consumer of blood. Compared to victims of motor vehicle accidents, falls, or non-gun assaults, gun victims consume 10 times more blood, and are 14 times more likely to die in a trauma facility. Massive transfusions themselves carry a significant risk of consumption of clotting factors, acidosis, and hypothermia.
These are the major facts. But how did we arrive at this point in time? Where did the knowledge come from? When did we come to understand the human circulatory system, and the components and functions of blood itself? Who came up with the idea of transferring blood from one person to another, and how did we learn to do it safely? And who created “blood banks,” and why?
An abbreviated history would have to begin with William Harvey, who was born on April 1, 1578 in Folkestone, England, and had the good fortune of having the town’s mayor as his father. In his youth, he was described as a “humorous but extremely precise man” who loved coffee and combing his hair in public. He was privileged, curious, and studious – a “dog on a pants leg” kind of guy when it came to understanding one thing in particular: the human circulation.
Without modern tools, he deduced from inference rather than direct observation (aided by observations and dissections of a wide range of fish and mammals) that blood was pumped by a four chamber heart through a “double circulation system” directed first to the lungs and back via a “closed system” and then out again to the brain and bodily organs.
He correctly estimated human blood volume (10 pints or 5 liters), the number of heart contractions, the amount of blood ejected with each beat, and the fact that blood was continuously recirculated. He published all of the above in an epic volume, De Motu Cordis, in 1628. The only thing he didn’t nail down was the presence of tiny peripheral capillaries. That was added in 1660 by Marcello Malpighi who visualized the tiny channels in frog’s lungs.
All this occurred without an understanding of what blood actually was. It was only in 1658 that the Dutch biologist Jan Swammerdam, with his microscope, described red blood cells. The notion of possible benefits of transfusions emerged within this same time frame. In 1667, a physician tried infusing sheep blood into a sick 15-year-old child. The child survived, but not for long. Blood from other animals was tried as well, without success. Over the years, other liquids were infused, including human, goat, and cow milk, which yielded “adverse effects” and led others to try saline as a blood substitute in 1884.
The problems with human-to-human transfusion were threefold. First, between collection from the donor and delivery to the recipient, the blood tended to clot. Second, there was no way of preserving uncontaminated blood for future use. And finally, the infused blood often inexplicably triggered life-threatening reactions.
Anticoagulants, like sodium citrate, came into use at the turn of the century, addressing issue number one. The other two issues owe their resolution in large part to an Austrian biologist named Karl Landsteiner. Through a series of experiments in 1901, he was able to recognize protein and carbohydrate appendages (or antigens) on red blood cell surfaces that were physiologically significant. He defined the main blood antigen types – A, B, AB and O – and proved that success in human blood transfusion would rely on correctly matching the blood types of donors and recipients. In 1923, he and his family emigrated to the U.S., where he joined the Rockefeller Institute and defined the Rh antigen (the + and – familiar to all from their blood types) in 1937. For his efforts, he received the Nobel Prize in Physiology or Medicine.
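Landsteiner’s insight reduces, for red cell transfusion, to a rule set simple enough to write in a few lines: a recipient can safely receive red cells only from donors whose A/B antigens the recipient also carries (type O cells carry neither, so they suit everyone), and Rh-negative recipients should receive Rh-negative blood. The sketch below encodes those textbook rules for red cells only; plasma compatibility runs in the opposite direction and is not modeled.

```python
# Textbook ABO/Rh compatibility for RED CELL transfusion only.
# (Plasma compatibility runs the other way and is not modeled here.)

ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

def red_cells_compatible(donor: str, recipient: str) -> bool:
    """e.g. red_cells_compatible('O-', 'A+') -> True"""
    d_abo, d_rh = donor[:-1], donor[-1]
    r_abo, r_rh = recipient[:-1], recipient[-1]
    abo_ok = ANTIGENS[d_abo] <= ANTIGENS[r_abo]   # donor antigens must be a subset of recipient's
    rh_ok = not (d_rh == "+" and r_rh == "-")     # Rh-negative recipients need Rh-negative cells
    return abo_ok and rh_ok

print(red_cells_compatible("O-", "AB+"))   # True  -- O negative is the universal red cell donor
print(red_cells_compatible("A+", "O+"))    # False -- type O recipients carry no A or B antigens
```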
Human-to-human blood transfusions, from healthy to wounded servicemen, proved life-saving in WW I. But the invention of the “blood bank” would not arrive until 1937. Credit goes to Bernard Fantus, a physician and Director of Therapeutics at Cook County Hospital in Chicago. A year earlier, he had studied the use of preserved blood by the warring factions in the Spanish Civil War. He was convinced that collecting donated containers of blood, correctly typed and preserved, could be life-saving for subsequent well-matched recipients. His daughter, Ruth, noting that the scheme of “donors” and future “borrowers” resembled a bank, is credited with coining the label “blood bank.”
In its first year of operation, Fantus’s “blood bank” averaged 70 transfusions a month. Techniques for separating and preserving red cells, plasma, and platelets evolved after that. And real-life tragedies, like the Texas City, Texas, wharf fire of 1947, tested the system with mass injuries: 1,888 units of pooled plasma (150 cc each) were administered to survivors of that disaster.
Additional breakthroughs came in response to the demands of WW II. Blood fractionation allowed albumin to be separated from plasma in 1940. Techniques to freeze-dry and package plasma for rapid reconstitution became essential to Navy and Army units in combat. 400 cc glass bottles were finally replaced by durable and transportable plastic bags in 1947. And blood warming became the standard of care by 1963. By 1979, the shelf life of whole blood had been extended to 42 days through the use of an anticoagulant preservative, CPDA-1, and refrigeration. Platelets are more susceptible to contamination and are generally preserved for only 7 days. Preserved components also came to be prescreened for a wide variety of infectious agents, including HIV beginning in 1985.
This brief history illustrates how complex and interwoven, hard-fought and critically important, are the advances in medical science. Empowered citizens today are not only the beneficiaries of these discoveries, but contributors as well. The history of blood transfusion perfectly illustrates this point. If you don’t know your blood type, finding out is a useful starting point, and donating blood remains a remarkable act of good citizenship and a lasting contribution to the health of our nation.
Tags: anti-coagulent > bernard fantus > blood > blood band > de motu cordis > hemorrhage > medical discovery > medical history > red cross > transfusion > william harvey
Science as a Tool of Diplomacy. The Brief History of Balloon Angioplasty.
Posted on | March 17, 2025 | Comments Off on Science as a Tool of Diplomacy. The Brief History of Balloon Angioplasty.
Mike Magee
“Navigating Uncertainty: The recently announced limitation from the NIH on grants is an example that will significantly reduce essential funding for research at Emory.”
Gregory L. Fenves, President, Emory University, March 5, 2025
In 1900, U.S. life expectancy was 47 years. Between maternal deaths in childbirth and infectious diseases, it is no wonder that cardiovascular disease (barely understood at the time) was an afterthought. But by 1930, as life expectancy approached 60 years, Americans stood up and took notice. They were literally dropping dead of heart attacks on softball fields.
Remarkably, despite scientific advances, nearly 1 million Americans (931,578) died of heart disease in 2024. That is 28% of the 3,279,857 deaths recorded last year.
The main cause of a heart attack, as every high school student knows today, is blockage of one or more of the three main coronary arteries – each 5 to 10 centimeters long and four millimeters wide. But at the turn of the century, experts didn’t have a clue. When James Herrick first suggested blockage of the coronaries as a cause of “heart seizures” in 1912, the suggestion was met with disbelief. Seven years later, in 1919, the clinical findings for “myocardial infarction” were confirmed and now correlated with ECG abnormalities for the first time.
Scientists had long been aware of the anatomy of the human heart, but it wasn’t until 1929 that they were actually able to see it in action. That was when a 24-year-old German medical intern named Werner Forssmann came up with the idea of threading a ureteral catheter through a vein in his arm and into his heart.
His superiors refused permission for the experiment. But with junior accomplices, including an enamored nurse and a radiologist in training, he secretly catheterized his own heart and injected dye, revealing a live four-chambered heart for the first time. Werner Forssmann’s “reckless action” was eventually rewarded with the 1956 Nobel Prize in Medicine. Another two years would pass before the dynamic Mason Sones, Cleveland Clinic’s director of cardiovascular disease, successfully (if inadvertently) imaged the coronary arteries themselves without inducing a heart attack in his 26-year-old patient with rheumatic heart disease.
But it was the American head of all Allied Forces in World War II, turned President of the United States, Dwight D. Eisenhower, who arguably had the greatest impact on the world’s focus on this “public enemy #1.” His seven heart attacks, in full public view, have been credited with increasing public awareness of the condition, which finally claimed his life in 1969.
Cardiac catheterization soon became a relatively standard affair. Not surprisingly, on September 16, 1977, another young East German-born physician, Andreas Gruntzig, performed the first balloon angioplasty – but not without a bit of drama.
Dr. Gruntzig had moved to Zurich, Switzerland, in pursuit of this new, minimally invasive technique for opening blocked arteries. But first, he had to manufacture his own catheters. He tested them on dogs in 1976 and excitedly shared his positive results that November at the 49th Scientific Session of the American Heart Association in Miami Beach.
Poster Session, Miami Beach, 1976
He returned to Zurich that year expecting swift approval to perform the procedure on a human candidate. But a year later, the Swiss board had still not given him a green light to use his newly improved double-lumen catheter. Instead, he was invited by Dr. Richard Myler of the San Francisco Heart Institute to perform the first-ever balloon coronary artery angioplasty on an awake patient.
Gruntzig arrived in May 1977 with equipment in hand. He was able to successfully dilate the arteries of several anesthetized patients who were undergoing open-heart coronary bypass surgery. But sadly, after two weeks on hold there, no appropriate candidate had emerged for a minimally invasive balloon angioplasty in an awake patient.
In the meantime, a 38-year-old insurance salesman, Adolf Bachmann, with severe coronary artery stenosis, angina, and ECG changes had surfaced in Zurich. With verbal assurances that he might proceed, Gruntzig rushed back to Zurich. The landmark procedure at Zurich University Hospital went off without a hitch, and the rest is history.
Within a few years, Gruntzig accepted a professorship at Emory University and relocated with his family. He was welcomed as the Director of Interventional Cardiovascular Medicine.
As the Frontiers in Cardiovascular Medicine reported in 2014: “Unlike Switzerland, the United States immediately realized Grüntzig’s capacity and potential to advance cardiovascular medicine. Grüntzig was classified as a ‘national treasure’ by the authorities in 1980; however, he was never granted United States citizenship. Emory University had just received a donation of 105 million USD from the Coca-Cola Foundation (an amount which in 2014 would equal approximately 250 million USD), one of the biggest research grants ever given to an academic institution, which allowed the hospital to expand on treatment of coronary artery disease using balloon angioplasty technology.”
Gruntzig’s star rose quickly in Atlanta. His combination of showmanship, technical expertise, looks and communication skills drew an immediate response. Historians saw him as a personification of the American dream. As they recounted, “The first annual course in Atlanta was held in February 1981. More than 200 cardiologists from around the world came to see the brilliant teacher in action. The course lasted 3 and 1/2 days with one live teaching case per half day and, with each subsequent course, the momentum for angioplasty increased.”
According to Emory records, “In less than 5 years at Emory, Grüntzig performed more than 3,000 PTCA procedures, without losing a single patient.” Remarkably, after 10 symptom free years, Gruntzig’s original patient, Adolf Bachmann, allowed interventional cardiologists from Emory to re-catheterize him on September 16, 1987, the 10-year anniversary of his original procedure. The formal report documented that the artery remained open, and the patient was symptom free.
As this brief history well illustrates, science has historically been a collaborative and shared affair on the world stage. In an age when Trump is simultaneously disassembling America’s scientific discovery capabilities, undermining historic cooperation between nations, and leaving international public health initiatives in shambles, it is useful to remember that institutions like Emory have long understood that science requires international cooperation – and that it not only has the power to heal individuals, but is also a critical tool of diplomacy.
Video: Andreas Gruntzig (in his own words).
Tags: Adolf Bachmann > Andreas Gruntzig > balloon angioplasty > cardiac catheterization > Cleveland Clinic > Dwight D. Eisenhower > Emory University > Gregory Fenes > James Herrick > Mason Sones > Richard Myler > Werner Forssmann