A Beautiful Technology: The Lost Art of Triangular Bandaging

The following post is a paper written by Maria Null, a junior majoring in Biology, Society, and Environment at the University of Minnesota, in the spring semester of 2015 for Dominique Tobbell’s class HMED 3075.  In a recently published pair of articles in the Bulletin of the History of Medicine, Dominique Tobbell and Lois Hendrickson described  their use of historical artifacts (from the Wangensteen Historical Library) in their history of medicine courses.  Ms. Null’s paper is the second of three papers offered as examples of the work students have done in their classes.

Bandages have demonstrated power as a medical technology in their many varieties, applications, and restorative qualities throughout the history of humanity. Bandaging as a ‘practice’ has been characterized as an ‘art’ in the literature of medicine, nursing, and mortuary sciences cross-culturally.  Notably referred to as a “dying art,” early bandaging is evidenced by the Egyptian practices of mummification, biblical texts, and popular fictional literature.[1] The injuries warranting bandaging in war, medicine, and civilian life are interrelated, and their treatments have been informed by the dissemination of knowledge by those who have studied, practiced, and mastered the “art of bandaging.”

This paper will argue that improvements in the materials of the triangular bandage, its notable efficacy in sustaining the lives of the wounded, and the dissemination of knowledge of its application during the First World War (1914-1918) contributed to its decline in status as a medical technology through reforms in nursing and ‘first-aid’ education.

“Bandage” is derived from the French term “bande,” meaning an article used to secure an injury and to bind it.[2] It is important, for the purposes of this essay, to distinguish between two terms commonly used interchangeably: “dressing” and “bandage.” In this text, “dressings” will be examined only briefly, as they are relevant to the application of a “bandage” that binds the dressing to a portion of the human body.  “Bandage” will refer to an apparatus used to bind a dressing, support a portion of the body, or directly cover a portion of the body during an emergency, in which case it acts as what the reader might otherwise conceive of as a “dressing.”

The rise of the modern bandage, in its many forms, was coupled with increased knowledge of human anatomy and physiology, which enabled physicians and nurses to designate specific bandages for unique applications to certain portions of the human body. Four distinct bandage types have been popularly employed in nursing since the early nineteenth century: the roller bandage, the four-tailed bandage, the scultetus, and the triangular bandage.[3]  Each bandage was used for a specific purpose; however, each was credited with the capacity to retain dressings or splints.[4]  Among the four popular bandage types, the late 1800s saw the rise of what became known as the ‘triangular bandage.’  In 1831, the Swiss surgeon Mathias Mayor was the first credited with recognizing the utility of what he called the ‘handkerchief bandage.’[5]  But the ‘handkerchief bandage’ remained unpopular in medicine until nearly forty years later, when it was used on the field of battle by the German surgeon Professor Johannes Friedrich von Esmarch.[6]  In texts and according to manufacturers, Esmarch’s “discovery” defined the triangular bandage throughout the First World War.  Indeed, he was the first to suggest printing illustrations of its use, which would come to uniquely characterize the triangular bandage.[7]  However, his accomplishment has been contested by medical doctors throughout history because of how loosely the triangular bandage may be defined as a technology.[8]

Figure 1. Professor Esmarch’s Bandage with Printed Graphic Illustrating Use


Esmarch’s Bandage: The triangular bandage is depicted with elaborate illustration to inform the user of its many applications.[9] Image captured at the Wangensteen Library.

The triangular bandage as a technology is defined by its existence as a physical object, its versatile functions, and the knowledge needed to apply it.[10]  Early nursing texts and supplementary readings suggest the bandage was improvised in dire emergencies from clothing, linen, pillowcases, handkerchiefs, and bed sheets.[11] Additional texts describe the fabrication and measurements of bandages produced deliberately as medical technologies. Bleached or unbleached muslin or calico, linen, silk, or gauze was used in instruction and in the treatment of injury.[12]  The physical measurements of triangular bandages varied and were determined by user selection and, eventually, by the manufacturer.  Texts suggest an appropriate median measurement for the triangular bandage of the 1910s was approximately one square yard.[13] The bandage served its many functions only once folded, a process illustrated in numerous nursing texts and additional ‘first-aid’ handbooks.[14]

Figure 2. A Manual of Instruction in Folding the Triangular Bandage


Standard illustration with instructional text in an aid book.  Figure 59 depicts starting materials; figure 60, the triangular bandage. Figures 61 through 65 depict conversion of the triangular bandage into a compact form for storage or transport. Figures 66 and 67 illustrate the folding of the triangular bandage into a ‘cravat’ and its subsequent rolling into ‘cord’ form.[15]

Among its many functions, arresting hemorrhage was of particular importance on the field of battle, a function requiring extensive knowledge of the many uses of the triangular bandage for proper treatment to be achieved. After its introduction to ambulance work in the field by Esmarch in the late 1800s, physicians began producing texts characterized by elaborate illustration and in-depth instruction on triangular bandaging.[16] Understanding texts from the late 1800s through the mid-1920s required knowledge of medical terminology and anatomy, as well as the motivation to learn and practice countless variations on the bandaging techniques.  Knowledge of the most useful materials and of the speed, neatness, and proper tension required of the application was integral to the success of the treatment, and often to the survival of the patient.  As the U.S. entered the First World War, few collective groups in America possessed the same drive and commitment to establishing the agency of the triangular bandage overseas as the United States Army Nursing Corps.

Nurses, patients, first responders, and surgeons observed and experienced the impact of the triangular bandage in medical practice. As a bandage whose primary purpose was to serve in emergencies, its typical users were first responders. First responders on the field of battle were soldiers of the U.S. Army who were equipped with the bandage in early ‘first-aid kits.’ A ‘first-aider,’ as defined by Major Charles Lynch of the Medical Corps of the U.S. Army, was any individual intervening during a medical emergency prior to the summoning or arrival of a physician.[17] Field hospitals staffed with nurses were the front lines of ‘first-aid.’  Triage and dressing stations received casualties, and upon assessment, nurses applied more elaborate triangular bandages to soldiers. Bandages in field hospitals were not themselves any more complex in fabrication than their field ‘first-aid’ counterparts; they were, however, endowed with greater power to treat injury by nurses with the knowledge and experience to manipulate them.  By 1917 the national curriculum for nursing education had outlined coursework for instruction in the use of the triangular bandage.  Elementary bandaging was taught across five classes spanning ten hours.  Classes were taught by instructors or “competent” head nurses affiliated with surgery, orthopedics, and first-aid.  The emphasis on careful manipulation, practice, and demonstration of bandaging skills demanded speed, efficiency, and dexterity of nurses.  Likely two or fewer hours were spent on instruction in the triangular bandage, which was acknowledged as a first-aid utility requiring additional education and training.[18] An additional ten hours of instruction in elementary nursing and first aid focused on preparing nurses to adapt readily to emergency situations, much like those in the army.  The curriculum outline emphasized the additional training required of nurses upon entry into the Army Nursing Corps.[19]

The evolution of the triangular bandage, in its fabrication and dimensions, contributed to its increasingly widespread use, manufacture, and distribution to civilians.  While the bandage was readily available to any user with knowledge of textiles that could be manipulated for bandaging purposes, it was not until the mass production of the ‘Esmarch bandage’ by the American consumer healthcare company Johnson & Johnson that civilians saw standard triangular bandages in their very own ‘First-Aid Kits.’  By 1917, Johnson & Johnson was regularly producing a triangular bandage for civilian ‘First-Aid Kits.’[20] The 36-inch-square ‘Esmarch Bandage’ was a staple of the kits, originally supplied to soldiers as early as the Spanish-American War in the late 1800s, when the company first entered into war-time production efforts.[21] In notes on manufacturing for the American Red Cross, Johnson & Johnson included an “explanation of numbers shown on figures in illustration of the Esmarch Bandage.”[22] The graphic illustration attempted to recreate the iconic print characteristic of Professor Esmarch’s original fabrications.  The widespread development and distribution of the bandage extended the user base and established a market for American consumers of first-aid products.

Figure 3. Johnson & Johnson’s Manufactured “Esmarch Bandage”


Johnson & Johnson’s triangular bandage, named for Professor Esmarch of Kiel. The bandage mimicked Esmarch’s original design, with graphic illustration for the dissemination of knowledge of the bandage’s uses.[23]

Bandages for purchase were fabricated from surgical gauze or muslin and were available at drug stores.  Not only were they staples of the ‘First-Aid Kit,’ but they were also included in the ‘Johnson’s First Aid Cabinet’ and the popular packet known as ‘Johnson’s First Aid for Wounds.’[24]  Both were features of aid in the railroad industry, manufacturing establishments, and schools.  The importance of ‘first-aid,’ and the responsibility of citizens to learn and administer it, was demonstrated in Johnson’s First Aid Manual.  The manual depicted improved bandaging materials and denounced the improvisation of triangular bandages, in effect suggesting that efficacy of treatment required the newer, standardized ‘first-aid’ materials supplied by the manufacturer.[25]

Indeed, Johnson & Johnson boasted of the efficacy of the triangular bandage, suggesting, “It is probable no other system of wound dressing can accomplish so much, and in such a reliable manner, in rendering first aid, as the use of the triangular bandage.”[26]  Improved function of the triangular bandage went hand in hand with its improved materials.  As the bandage was developed, made more compact, and fabricated from more durable materials, ‘first-aid’ was rendered more successfully by soldiers, nurses, and civilians. This was demonstrated in part by a change in nursing curriculum from 1917 to 1933, which incorporated the new text Bandaging, published by A.D. Whiting, Instructor of Surgery at the University of Pennsylvania.  In his text, Whiting outlined the effect of eliminating gauze as a fabric for the triangular bandage. While gauze had been central to earlier triangular bandages, Whiting proclaimed the improved treatment of injury achieved by bandages made of other fabrics.  He suggested that gauze was not sturdy enough to exercise the utilities of the bandage, and that bleached or unbleached muslin had instead proven its worth for proper bandaging and for sustaining life in emergency situations.[27]  Whiting’s text became the standard for bandaging in nursing curriculum following the First World War and represented the transforming identity of the bandage in several key ways.

While the medical terminology in Whiting’s text maintained the integrity of the triangular bandage as a medical technology within the surgical and nursing professions (including the ‘Occipitofrontal Triangle,’ the ‘Iliofemoral Triangle,’ and the ‘Mentovertico-occipital Cravat’), American citizens in the 1910s and ’20s were increasingly exposed to new language in popularized ‘first-aid’ education.[28]  Johnson & Johnson not only served as a manufacturer contributing to the physical evolution of the triangular bandage; it also served as a primary source of education on the procedures of ‘first-aid’ supplied with its products.  Johnson’s First Aid Manual, in its many editions, used language contradictory to that of Whiting and other medical texts of the early 1910s.  A 1917 edition of Johnson’s explicitly stated, “no instructions are given in this Manual in respect to anatomy or physiology. A knowledge of these subjects is not deemed essential either to the intelligent use of the manual or the application of first aid.”[29]  The certainty with which the Manual deemed features of medicine “non-essential” to ‘first-aid’ contributed to a shifting paradigm in U.S. ‘first-aid’ culture.  With the improvement of the bandage nearing the end of the First World War, manufacturers, educators, and the American Red Cross began to simplify and consolidate aid education to promote the identity of the ‘first-aider.’

In early texts, the complexity of the triangular bandage as a technology is evident. While the triangular bandage was free in form, it was manipulated according to the principles of geometry. Rudimentary knowledge of the subject was required of users, who were instructed to fold the triangular bandage relative to its features: bases, sides, apexes, squares, quadrilaterals, triangles within larger triangles, extremities and ends, and angles of the triangle.[30] ‘Broad’ and ‘narrow’ folds of the bandage and its shapes as a ‘cravat’ and ‘cord’ classified its many configurations.[31] Efficacy of treatment and preservation of life required knowledge of each of these features.  Additionally, knowledge of how and where to apply a bandage, and with what pressure, was essential. A majority of texts alluded to the harm that could result from an ineffectively applied triangular bandage: continued hemorrhaging or arrested circulation. While it was often assumed that any attempt at immediate aid (erroneous or informed) would increase chances of survival, texts and aid books asserted that an improperly applied bandage could just as likely harm a patient as help them.[32] Extensive study and practice manipulating the triangular bandage were required of nurses to master the ‘art of bandaging.’ Yet while acknowledging the exceptional skill possessed by nurses who “perfected” the “art” of applying the triangular bandage in ‘first-aid,’ virtually no texts awarded them prestige.  This reflects nurses’ status in the historical context: their work was characterized as tasks of manual dexterity requiring little subjective analysis of procedure.[33] The bandage itself, in contrast, had reached a new status as an efficacious medical technology as its ‘first-aid’ properties were realized by American civilians following the First World War.

The distribution of knowledge of the bandage to civilians was characterized by changes in the language of texts, in illustrations, photographs, and demonstrations, and in the conceptualization of the bandage’s status as a technology.  From descriptions of nearly twenty specialized triangular bandages in medical texts such as Whiting’s to only eight in First Aid in Emergencies, the triangular bandage experienced a reduction in its versatility once adopted by the civilian ‘first-aider.’[34] Fewer varieties of the triangular bandage were included in texts intended for civilian education in aid. Texts that once required understanding of medical terminology, anatomy, and geometry were modified for the American consumer.  While highly descriptive texts instructed bandaging in the late 1800s, the 1910s saw the efficient integration of photographs with fewer lengthy text inserts.  Willing civilians could learn by following step-by-step depictions of bandaging featured in numerous publications at the time. In her Illustrations of Bandaging and First-Aid, registered nurse Lois Oakes produced for the public eye knowledge once reserved for production and consumption by surgeons and nurses.  Oakes’ publication thoroughly depicted the many functions of the triangular bandage, attempting to preserve its status as a complex technology. The illustrations and a review in the American Journal of Nursing, however, reflected the declining status of the bandage and the popular assumption that any willing individual could become skilled in administering aid.[35]  The journal’s review proclaimed that Oakes illustrated the use of the bandage “so plainly that even an inexperienced person could study them with advantage.”[36] While not declaring the civilian’s capacity to supersede the skills of a trained nurse, the journal was suggesting that the civilian could acquire skills once reserved for nurses. Additionally, demonstrations and lectures on first-aid, known as “ambulance” work, were popularized with the public through the American Red Cross several years into the First World War.  A 1917 article in the Washington Post was typical of newspaper items from the period: notices of lectures on first aid and bandaging for the public.[37] These changes reflected the American attitude toward ‘first-aid’ by the end of the war: that the civilian had the agency and responsibility to become a competent first-aid responder.

Through evident changes in nursing education and ‘first-aid’ curriculum adoption, it is possible to examine the effects the evolution of the triangular bandage ultimately had on its own declining status as a medical technology.  While the bandage was increasingly manufactured to meet consumer demand in a new age of civilian ‘first-aid,’ a job that had once formally belonged to the nurse came under scrutiny.  Updates to the national curriculum in nursing by 1933 reflected a subtle but important decline in time dedicated to bandaging training.  Course time in elementary bandaging had been decreased from ten to eight hours of instruction.[38]  In contrast, however, there was a five-hour increase in emergency nursing and first aid.  This increase in instructional training (a response to increased demand after the First World War) might have reflected increased emphasis on emergency bandaging techniques, had the triangular bandage maintained its status as a medical technology.  Rather, the opposite occurred.  Coursework objectives outlined Army and Red Cross nursing equipment training and training on wounds, fractures, and strains, with no explicit mention of bandaging.[39]  Red Cross texts listed as supplementary materials suggest a shift in American aid culture as the U.S. affiliate of the International Federation of Red Cross and Red Crescent Societies was officially integrated into American nursing education.  The motivation of the American Red Cross in aid education was continuously transformed as America adopted ‘first-aid’ culture after the triangular bandage was made accessible.  Educating citizens was not only a public health measure, but also a market move in conjunction with the consumer healthcare company Johnson & Johnson.  The triangular bandage and its manual in the ‘First-Aid Kit’ of the home, the factory, and the school empowered citizens and subsequently lowered the bandage’s status as a medical technology in the hands of the educated, practiced, and masterfully skilled artists of bandaging: America’s nurses.

Gradually, references to the Esmarch bandage began to disappear from texts by the 1940s. Owing in part to the emergence of newer technologies (some developed by Esmarch himself, including a rubber bandage), but largely to its declining status as a medical technology, the complex triangular bandage fell out of favor in clinical and civilian texts.  Many outside actors had established the importance of first-aid, namely the manufacturer Johnson & Johnson backed by the American Red Cross.  The ‘art of bandaging’ so masterfully executed by trained nurses was simplified and condensed to teach the civilian to adopt the new identity of the American ‘first-aider’: an obligated responder in emergency situations.  The relegation of the triangular bandage largely contributed to its disappearance as a prominent feature of nursing curriculum.  The change was successful, however, in establishing a solid foundation of ‘first-aid’ for American citizens.  From a soldier’s pocket to the nurse’s field hospital to the hands of the American adolescent in ‘first-aid’ class, the triangular bandage was one of the most versatile medical technologies ever to reach the hands of the American citizen.

[1] E.K. Herrmann, “The Dying Art of Bandaging,” Western Journal of Nursing Research Vol. 14, No. 6 (1992): 791.

[2] Ambulance Work and Nursing (Chicago: W. T. Keener & Co., 1899), 67.

[3] Committee on Education of the National League of Nursing Education, Standard Curriculum for Schools of Nursing. (Baltimore: Waverly Press,1917), 86.

[4] Albert S. Marrow, The Immediate Care of the Injured (Philadelphia and London: W.B. Saunders Company, 1906), 108.

[5] Vincent J. Little, “The Fabric of First Aid: A History of the Triangular Bandage,” Pharmacy History Australia: The Newsletter of the Australian Academy for the History of Pharmacy no. 9 (1999): 10.

[6] Ibid.

[7] Ambulance Work, 67.

[8] J.M. Grant, M.D., “Professor Esmarch’s Triangular Bandage,” The Lancet (1874): 746.

[9] Johannes Friedrich von Esmarch, Samariterbriefe (Kiel: Verlag von Lipsius & Tischer, 1886), 29.

[10] Joel D. Howell, Technology in the Hospital: Transforming Patient Care in the Early Twentieth Century (Baltimore: Johns Hopkins University Press, 1995), 8.

[11] Major Charles Lynch, American Red Cross Abridged Text-Book on First Aid: Woman’s Edition; a Manual of Instruction (Philadelphia: P. Blakiston’s Son & Co., 1913), 10; Alvah H. Doty, M.D., A Manual of Instruction in Principles of Prompt Aid to the Injured: Designed for Military and Civil Use (New York: D. Appleton and Company, 1890), 77.

[12] Henry R. Wharton, M.D., Minor Surgery and Bandaging (Philadelphia and New York: Lea Brothers & Co., 1902), 17.

[13] Albert S. Marrow, The Immediate Care of the Injured, 134.

[14] Ambulance Work, 68.

[15] Ibid.

[16] Henry C. Leonard, A Manual of Bandaging: Adapted for Self-Instruction. (Detroit: Daily Post Book Printing Establishment, 1876).

[17] Charles Lynch, American Red Cross Abridged, 2.

[18] Committee on Education. Standard Curriculum for Schools of Nursing, 85.

[19] Committee on Education. Standard Curriculum for Schools of Nursing, 117.

[20] W.G. Stimpson, Prevention of Disease and Care of the Sick (Washington: Government Printing Office, 1917), 224.

[21] Ibid.; General Index to Red Cross Notes (New Brunswick: Johnson & Johnson, 1900), 18.

[22] General Index to Red Cross Notes, 104.

[23] A.D. Whiting, Bandaging (Philadelphia and London: W. B. Saunders Company, 1915), 131.

[24] Johnson’s First Aid Manual (Baltimore: Johnson & Johnson, 1917), 59.

[25] Johnson’s First Aid Manual, 69-75.

[26] Johnson’s First Aid Manual, 69.

[27] A.D. Whiting, Preface to Bandaging, 7.

[28] A.D. Whiting, Bandaging, 132-143.

[29] Johnson’s First Aid Manual, 3.

[30] Ambulance Work, 68.

[31] Ambulance Work, 69.

[32] Charles Lynch, American Red Cross Abridged, 1.

[33] Dominique Tobbell, “Nursing In the Early 20th Century.” (Lecture presented, Minneapolis, Minnesota, October 05, 2015).

[34] A.D. Whiting, Bandaging, 10; Eldridge L. Eliason, First Aid in Emergencies, contents (Philadelphia and London: J.B. Lippincott Company, 1915), v.

[35] Lois Oakes, Illustrations of Bandaging and First-Aid (Baltimore: The Williams and Wilkins Company, 1942).

[36] “Book Reviews,” American Journal of Nursing Vol. 41, No. 1 (1941): 130.

[37] “First Aid Advice in Red Cross Lecture,” Washington Post, March 22, 1917, 4.

[38] Committee on Education of the National League of Nursing Education, A Curriculum for Schools of Nursing (New York: National League of Nursing Education, 1932), 108.

[39] Committee on Education, A Curriculum, 155.


Electrocardiogram and Diphtheria in the Early 20th Century

The following post is a paper written by Matthew Cohen, a senior majoring in Biology, Society, and Environment at the University of Minnesota, in the spring semester of 2015 for Dominique Tobbell’s class HMED 3075.  In a recently published pair of articles in the Bulletin of the History of Medicine, Dominique Tobbell and Lois Hendrickson described  their use of historical artifacts (from the Wangensteen Historical Library) in their history of medicine courses.  Mr. Cohen’s paper is the first of three papers offered as examples of the work students have done in their classes.

The electrocardiogram’s ability to observe the conduction of the heart changed medical practice for diphtheria in the early 20th century. My paper will argue that the implementation of the electrocardiogram by physicians changed the diagnosis, monitoring, and treatment of diphtheria-infected patients in America from 1900 to 1938. I will be using the electrocardiogram from the Wangensteen Historical Library collection. This paper will discuss how the electrocardiogram functions, and the implications and clinical relevance of using the electrocardiogram to diagnose the infectious disease diphtheria.

[Image: electrocardiogram from the Wangensteen Historical Library collection]

To understand the impact and effects of the electrocardiogram, it must first be defined as a medical technology. The electrocardiogram (ECG) is a medical technology used to record the electrical signals of the heart. According to Joel Howell’s definition of technology, the ECG is a medical technology by virtue of being a physical object, the activity it performs, and the technological know-how needed to use it.[1] The ECG is relatively easy to use and produces an accurate recording of the heart’s electrical signals; however, the scientific know-how needed to understand the results is what makes the ECG a medical technology. The information recorded by the ECG must be interpreted by a person skilled enough to relate the physical tracing to the physiological functions of the heart. This means that an anatomical and physiological understanding of the heart is required to accurately understand the results.

The heart has two main components that must be considered: the physical heart and its electrical system. The heart is a muscle composed of specialized cardiac cells which, under the influence of an electrical current, contract to force blood throughout the body.[2] The heart has four anatomically distinct regions. The upper portion of the heart is divided into two chambers called the atria, and the lower portion of the heart is divided into two chambers called the ventricles. These chambers fill with blood, which is expelled from the heart to the body during systolic contraction. This is the physical pumping system of the heart, which must work in conjunction with the electrical system.

The coordinated contraction of the atria and ventricles is controlled by the heart’s electrical system. The ECG records this electrical current, which is a representation of the heart’s physical functioning. The electrical signal passes through a well-established pathway inside the heart. Cardiac tissue has unique properties which allow it to generate its own electrical impulses. These electrical impulses, under normal cardiac function, originate in a specific cluster of cardiac cells called the sinoatrial (SA) node. The electrical impulses then travel via the intra-atrial pathway to the atrioventricular (AV) node.[3] This results in the contraction of the left and right atria. The electrical impulse then travels through the bundle of His and down the Purkinje fibers, resulting in the contraction of the left and right ventricles.[4] This is the normal pathway of the heart’s electrical activity, called a normal sinus rhythm. Since all cardiac cells have the potential to generate an electrical impulse, abnormal conductive pathways can also occur.

The invention of the ECG is credited to Willem Einthoven in 1901.[5] Einthoven developed the standardized ECG pattern, which displayed the electrical pathway discussed previously. Einthoven’s work led him to develop the three-limb lead placement, in which the leads form an equilateral triangle (now known as Einthoven’s triangle) used to capture the heart’s electrical activity; it is still used today.[6]

The essentials of the electrocardiogram have remained largely unchanged since its development. Einthoven’s early ECG was enormous: weighing over 600 pounds, it required two rooms and five people to operate.[7] Einthoven’s ECG measured the fluctuation in electrical current in a wire suspended between electromagnets.[8] The results were recorded onto a photographic plate. The electrocardiogram in the Wangensteen Historical Library collection was made by the Cambridge Instrument Company and was given the name “Simpli-Scribe Portable Model.”[9] This ECG was designed to be portable, with an exterior made of wood and a metal carrying handle.[10] This version of the ECG did not have a date of manufacture, but Cambridge produced the Simpli-Scribe from 1945 to 1960.[11] The Simpli-Scribe is approximately one cubic foot in size and is powered by a two-prong electrical cord. The five leads on the ECG are attached to the patient via a strap placed on the chest. The resulting electrical fluctuations are recorded by an electrically heated metal needle that scorches the reading into the paper.[12] While the Simpli-Scribe is smaller and more refined than the earliest electrocardiographs designed by Einthoven, the basics remained unchanged. Both required electromagnets, recorded fluctuations in electrical activity, and used wires attached to the patient at the same locations.

The use of the ECG in recording cardiac activity was well known by the mid-1910s. The electrocardiogram was hailed as an unquestionable way to determine whether one was sick or well.[13] The clinical applications of the ECG for monitoring cardiac rhythms such as auricular fibrillation and extrasystolic beats were well documented.[14] In 1916, the British army adopted the electrocardiogram as a means to screen army recruits, in conjunction with X-rays, to determine their cardiac health and fitness to serve.[15] By this period, the ECG was already recognized for its ability to determine cardiac health. The electrocardiogram was so prevalent that by 1924, Willem Einthoven was awarded the Nobel Prize in Physiology or Medicine for his discovery.[16] While the electrocardiogram was well known for its ability to record the heart’s electrical activity, its use in medicine during this time was limited to routine physical exams.  However, the importance of the ECG changed when it was used to monitor and diagnose cardiac issues in infectious diseases such as diphtheria.

Diphtheria is an infectious disease caused by the bacterium Corynebacterium diphtheriae, which is spread from person to person through respiratory droplets.[17] The disease targets the mucous membranes of the mouth and nose, from which the bacteria can travel to the respiratory system.  Diphtheria produces symptoms similar to a common cold, such as weakness, sore throat, and low-grade fever.[18] A thick coating builds up in the mouth and nose, making it difficult to breathe. While these respiratory symptoms cause difficulty breathing, the majority of deaths associated with diphtheria are the result of the toxin produced by the bacteria.[19] The diphtheria toxin (DT) can travel into the bloodstream, where it damages internal organs such as the heart, lungs, and kidneys.[20] The damage caused to the heart is called myocarditis, an inflammation of the heart tissue which, if left untreated, results in mortality as high as fifty percent.[21]

Diphtheria was a widespread infectious disease in the United States in the early 20th century. In 1921 there were 206,000 confirmed cases of diphtheria, which resulted in 15,520 deaths.[22] The population of the United States in 1921 was 108,538,000, according to the United States Census Bureau.[23]  This means that roughly 0.19% of the U.S. population in 1921 was infected with diphtheria. Diphtheria remained strongly prevalent in the U.S. even after the introduction of a vaccine in 1923.[24] The vaccine provided an effective means of prevention, but treatments for the afflicted remained the same. The strong presence of diphtheria is shown by specialized hospital wards dedicated to diphtheria patients fifteen years after the vaccine was introduced.[25]
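As a quick back-of-the-envelope check of these figures, using only the case, death, and population counts cited above:

\[
\frac{206{,}000\ \text{cases}}{108{,}538{,}000\ \text{people}} \approx 0.0019 \approx 0.19\%, \qquad \frac{15{,}520\ \text{deaths}}{206{,}000\ \text{cases}} \approx 7.5\%\ \text{case fatality}.
\]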

By the mid-to-late 1920s, the effects of the diphtheria toxin on the neuromusculature and fatty degeneration of the heart were well known, and understood by physicians to be longer lasting than previously thought. Jenner Hoskin, a doctor at the Philadelphia Hospital, conducted a study on the effects of diphtheria on the heart in 1926. Hoskin discovered that upon admission to the hospital, 72% of patients had no cardiac complication with the exception of tachycardia, but several days later 28% of patients were found to have abnormal pulses that degenerated into grave heart trouble.[26] While it was not clear what Hoskin defined as “grave heart trouble,” the implications were clear: diphtheria had deadly effects on the heart. Hoskin noted that virtually all patients admitted for diphtheria initially had tachycardia that persisted for 36-48 hours after initial treatment was given.[27] The continued presence of tachycardia following the first 36-48 hours was an indication of myocarditis, a result of the diphtheria toxin damaging the heart.[28] Hoskin’s use of persistent tachycardia over the treatment interval as an indicator was common practice at the time. Hoskin argued for the importance of electrocardiograms in identifying myocarditis secondary to diphtheria and in monitoring the patient’s heart even after the hospital stay.[29] Close monitoring of the heart could result in early recognition of cardiac complications and reduce mortality rates. Physicians in the early 20th century recognized that death as a result of diphtheria typically occurred on the 10th day of infection from acute myocarditis, or in the 3rd week from fatty degeneration of the heart muscle.[30] This highlights the importance of having an effective way to monitor cardiac function, which the electrocardiogram provides.

Prior to the widespread use of the ECG in monitoring heart function, a physician would assess heart health through auscultation of the heart, palpation of pulses, the physical appearance of the patient, and subjective statements from the patient.[31] Auscultation of cardiac sounds using the stethoscope was the preferred method.[32] The physician would listen to cardiac sounds at specific points on the chest to determine the health of the patient’s cardiac valves. The strength of the heart would also be assessed by feeling the pulse, typically at the wrist, to determine its regularity and strength.[33] The physical appearance of the patient was noted; pallor, or a pale appearance, was considered a sign of poor perfusion and attributed to cardiac insufficiency.[34] These assessments were subjective and depended on the knowledge of the individual physician to identify abnormalities.

Even in patients who survived the disease, lasting cardiac abnormalities were found, such as premature beats, diminished cardiac reserve, heart enlargement, and tachycardia. Hoskin stated that the lasting effects of diphtheria on the heart could be seen in the “slurring of the Q.R.S. complex” visible on the ECG.[35] The Q.R.S. complex represents the depolarization of the ventricles and the corresponding contraction. Cardiac abnormalities that resulted from the infection would need to be monitored over the patient’s life to ensure detection of worsening cardiac issues. Hoskin stated that it is of the “utmost importance that patients in whom cardiac symptoms and abnormalities have been discovered be thoroughly examined both clinically and electrocardiographically,” and that these examinations must continue until the condition resolved or stabilized.[36] The ECG could thus be utilized to monitor and diagnose diseases that have an impact on the heart but are not cardiac in nature, such as diphtheria. Diphtheria may have no effect on the electrical conduction system of the heart, but it results in physical changes to the heart tissue which can be seen on the ECG tracing, such as Q.R.S. complex slurring.

The shift in the prevalence of the ECG in cardiac monitoring can be seen in a 1938 article by Mason Leete, in which he states it is “presumptuous to enter such a field [diphtheria] armed only with fingers, stethoscope and sphygmomanometer.”[37] This displays an attitude shift during the late 1930s in which the subjective information gathered by the physician decreased in importance and the objective information produced by medical technology such as the ECG increased in significance.  The importance of the ECG in heart health was noted, but traditional measures of obtaining pulses, heart sounds, and, more recently, blood pressures were still valued.[38] The combined use of subjective information obtained by the physician and unbiased, technologically obtained data, such as from the electrocardiogram and sphygmomanometer, was valued as the most effective way to monitor a patient’s disease progress. A careful comparison of clinical and electrocardiographic findings was required to thoroughly assess the patient’s health and provide an accurate prognosis.[39]

In treating diphtheria and monitoring the progress of infected patients, the use of the electrocardiogram became the standard. The importance of the ECG in diphtheria patients is emphasized in Harries’s 1932 article titled “Auxiliary treatment of toxic diphtheria,” in which he states that “obtaining repeat electrocardiograms from the cases under treatment . . . provid[es] the most valuable evidence as to progress.”[40] This statement displays the value placed on ECGs in diphtheria patients as the most valuable tool to monitor progress, even over the physical observations of the physician. This shift in preference is mirrored in a 1935 article on the electrocardiogram in diphtheria, in which the author states that “although there is much agreement between electrocardiographic and clinical signs the former usually precedes the latter and change in the curves may demonstrate lesions which cannot be discovered by other means.”[41] This is an important shift from years prior: the ECG was now seen as the primary tool not only for monitoring patients but also for diagnosing previously unseen or unobservable lesions of infection. The instrument was sensitive enough, and its accuracy trusted to the point, that the information displayed by the ECG was considered more accurate than what the physician could observe. The importance of the ECG was shown in a 1937 article by Norman Begg, in which he states that “Survival depends upon the amount and distribution of undamaged or lesser damaged myocardial tissue,” the severity of which could be determined by the ECG.[42] The value of the ECG in monitoring the patient’s progress and prognosis was well known by this period, but the ECG also influenced the treatment of diphtheria.

The medicine used to treat diphtheria remained relatively unchanged from 1899 to the late 1930s. Early treatment consisted of the use of antitoxin against the diphtheria bacteria, although dosages varied significantly based on the preference of the physician providing care.[43] Additionally, the use of mercurials (mercury in solution, generally applied topically to the lesions of infection) was advocated, along with oxygen therapy as needed to treat difficulty breathing.[44] There is no mention of using electrocardiograms to monitor cardiac health. This picture changed dramatically in the mid-1920s with the prevalence of the ECG. While many of the medicines used to treat diphtheria remained unchanged, such as the use of antitoxin, the use of digitalis became prevalent to control the tachycardia that resulted from the diphtheria toxin.[45] This led to the prevalence of the ECG as a tool to determine when certain medicines were indicated for the patient.

The electrocardiogram was also important in its influence on the treatments given to diphtheria patients. The ECG was used to determine the extent of cardiac toxicity in diphtheria-infected patients and to exclude the use of certain drugs that, in milder cases of diphtheria, would have been indicated. Digitalis is an example: an ECG would be used to rule out the drug for a patient with cardiac damage.[46] Digitalis was well known for its effects of decreasing heart rate, increasing the strength of contractions, and increasing blood pressure.[47] While digitalis was generally indicated to treat tachycardia, in moderate to severe diphtheria infections, as determined by electrocardiograph, it was contraindicated due to the increased cardiac demand it imposed.[48] The ECG was used not only to detect abnormal conduction of the heart but also as a clinical tool to determine whether a medicine such as digitalis was safe for the patient.

The ECG was widely used in monitoring diphtheria patients and had a strong influence on the treatments rendered. Cases of diphtheria decreased rapidly in the late 1940s with the widespread use of the diphtheria vaccine, even though the vaccine had been developed in 1923.[49]  After the 1940s, the prevalence of diphtheria dropped sharply, from approximately 19,000 cases in 1945 to a single case in 1998.[50] The diphtheria vaccine resulted in the near-eradication of new cases of the disease in the United States. The treatment for those infected, with the addition of antibiotics such as penicillin in the late 1940s, remained the same.[51]

In conclusion, the electrocardiogram found widespread use in the medical field as a tool for assessing cardiac function. The utilization of the electrocardiogram in monitoring cardiac condition had profound impacts on the diagnosis and treatment of diphtheria during the early 20th century.  The ECG became the standard for assessing the severity of infection, monitoring patients’ progress, determining their prognosis, and deciding which medicines could safely be used. The success of the ECG in diphtheria-infected patients redefined the ECG as a medical technology for diagnosing infectious diseases as well as cardiac diseases. The ECG is currently used to monitor a wide variety of infectious diseases, including HIV, rubella, typhoid, rheumatic fever, and diphtheria.[52] The electrocardiogram changed medical practice in diphtheria and is still actively used in infectious disease care today.

Bibliography

Primary Sources:

Andersen. “The electrocardiogram in diphtheria.” The Lancet (1935): 689.

Begg, Norman. “Diphtheritic Myocarditis, an electrocardiographic study.” The Lancet (1937).

Harries, E. “Auxiliary treatment of toxic diphtheria.” The Lancet (1932).

“Heart Diagnosis in British Army.” New York Times (1916).

Hoskin, Jenner. “The after effects of diphtheria on the heart.” The Lancet (1926), 1141-1143.

Leete, Mason. “The Heart in Diphtheria.” The Lancet (1938).

McClanahan, H. “Treatment of Diphtheria.” The Lancet (1899).

Simpli-Scribe EKG, Wangensteen Historical Library collection, University of Minnesota-Twin Cities.

“Sure way to finding out how sick or well you are by measuring electricity in your body.” The Washington Post (1914).

Williams, James. “Electrocardiogram in Clinical Medicine.” American Journal of the Medical Sciences (1910): 644-668.

Williamson, Bruce. “The rational use of Digitalis.” The Lancet (1928).

Secondary Sources:

“Anatomy of the Heart.” National Heart, Lung, and Blood Institute, National Institutes of Health (November 2011), accessed November 22, 2015.

Barold, S.S. “Willem Einthoven and the birth of clinical electrocardiography a hundred years ago.” Cardiac Electrophysiology Review (2003): 99-104.

“Definition of Penicillin History” MedicineNet (2012).

“Diphtheria.” Centers for Disease Control (2013).

“Historical National Population Estimates.” United States Census Bureau (2000), accessed November 22, 2015.

Howell, Joel D. Technology in the Hospital: Transforming Patient Care in the Early Twentieth Century (Baltimore: Johns Hopkins University Press, 1995), 9.

Nalmas, Sandhya. “Electrocardiographic Changes in Infectious Diseases.” Hospital Physician (2007), 3.

“Simpli-Scribe.” Edward Hand Medical Heritage Foundation, accessed November 22, 2015.

“Vaccines.” Centers for Disease Control, accessed November 22, 2015.

 

Endnotes

[1] Joel D. Howell, Technology in the Hospital: Transforming Patient Care in the Early Twentieth Century (Baltimore: Johns Hopkins University Press, 1995), 9.

[2] “Anatomy of the Heart,” National Heart, Lung, and Blood Institute, National Institutes of Health (November 2011), accessed November 22, 2015.

[3] “Anatomy of the Heart” National Heart, Lung and Blood Institute

[4] “Anatomy of the Heart” National Heart, Lung and Blood Institute

[5] S.S. Barold, “Willem Einthoven and the birth of clinical electrocardiography a hundred years ago.” Cardiac Electrophysiology Review (2003), 99-104.

[6] “Anatomy of the Heart” National Heart, Lung and Blood Institute

[7]  “Simpli-Scribe.” Edward Hand Medical Heritage Foundation, accessed November 22, 2015.

[8] “Simpli-Scribe.” Edward Hand Medical Heritage Foundation

[9]  Simpli-Scribe EKG, Wangensteen historical library collection. University of Minnesota- Twin Cities.

[10] Simpli-Scribe EKG, Wangensteen historical library collection.

[11] “Simpli-Scribe.” Edward Hand Medical Heritage Foundation

[12] “Simpli-Scribe.” Edward Hand Medical Heritage Foundation

[13] “Sure way to finding out how sick or well you are by measuring electricity in your body,” The Washington Post (1914).

[14] James Williams, “Electrocardiogram in Clinical Medicine.” American Journal of the Medical Sciences (1910), 644-668.

[15] “Heart diagnosis in British army.” New York Times (1916), 21.

[16] Barold, “Willem Einthoven”, 99-104.

[17] Diphtheria Centers for Disease Control (2013) accessed November 22, 2015.

[18] Diphtheria

[19] Diphtheria

[20] Diphtheria

[21] Diphtheria

[22] Diphtheria

[23] “Historical National Population Estimates.” United States Census Bureau (2000), accessed November 22, 2015.

[24] Diphtheria

[25] Diphtheria

[26] Jenner Hoskin, “The after effects of diphtheria on the heart,” The Lancet (1926), 1141.

[27] Hoskin, “The after effects of diphtheria on the heart,” 1142.

[28] Hoskin, “The after effects of diphtheria on the heart,” 1142.

[29] Hoskin, “The after effects of diphtheria on the heart,” 1142.

[30] Hoskin, “The after effects of diphtheria on the heart,” 1142.

[31] Mason Leete, “The Heart in Diphtheria,” The Lancet (1938).

[32]  Leete, “The Heart in Diphtheria.”

[33]  Leete, “The Heart in Diphtheria.”

[34]  Leete, “The Heart in Diphtheria.”

[35] Hoskin, “The after effects of diphtheria on the heart,” 1142.

[36] Hoskin, “The after effects of diphtheria on the heart,” 1143.

[37] Leete, “The Heart in Diphtheria.”

[38] Leete, “The Heart in Diphtheria.”

[39] Leete, “The Heart in Diphtheria.”

[40] E. Harries, “Auxiliary treatment of toxic diphtheria” The Lancet (1932).

[41] Andersen, “The electrocardiogram in diphtheria,” The Lancet (1935), 689.

[42] Norman Begg, “Diphtheritic Myocarditis, an electrocardiographic study,” The Lancet (1937).

[43] H. McClanahan, “Treatment of Diphtheria.” The Lancet (1899).

[44] McClanahan, “Treatment of Diphtheria.”

[45] Bruce Williamson, “The rational use of Digitalis.” The Lancet (1928).

[46] Williamson, “The rational use of Digitalis.”

[47] Williamson, “The rational use of Digitalis.”

[48] Leete, “The Heart in Diphtheria.”

[49] “Vaccines,” Centers for Disease Control, accessed November 22, 2015.

[50] “Vaccines.”

[51] “Definition of Penicillin History” MedicineNet (2012), accessed November 22, 2015.

[52] Sandhya Nalmas, “Electrocardiographic Changes in Infectious Diseases,” Hospital Physician (2007), 3.

UPDATE on Teaching History of Medicine with Artifacts and Oral Histories

You can now access Teaching Medical History with Primary Sources: Introduction by Dominique A. Tobbell and Teaching with Artifacts and Special Collections by Lois Hendrickson on Project MUSE, and read the full articles in the Bulletin of the History of Medicine, Volume 90, Number 1, Spring 2016.  The syllabi for classes mentioned in these essays can be found in the BHM syllabus archive.


Undergraduate students in the Wangensteen Historical Library’s reading room looking at a table with artifacts.