Recommended Dose

Who Knows Anything? – Journalism, Caesarean Section, and the Production of Knowledge — How Did We Get Into This Mess?

The New York Times ran a story about an amazing c-section survival in 1337. But historians of medieval medicine don’t think it happened. By Monica H. Green. On Wednesday, 23 November 2016—the day before the Thanksgiving holiday in the U.S.—the New York Times ran what it likely assumed to be a “fun fact” story, a minor historical…

via Who Knows Anything? – Journalism, Caesarean Section, and the Production of Knowledge — How Did We Get Into This Mess?

London’s Pulse: Life and Health in Modern London

This past semester in my history of medicine class (HSCI 3423 at the University of Oklahoma), I asked students to design and execute a research project based on the on-line resource London’s Pulse: Medical Officer of Health Reports 1848-1972, one of the digital collections of the Wellcome Library.  This was a group research project, the second of two group research projects in this class.  There were 6 groups, each with 8-9 students.  For the first research project they had to write a collaborative paper of 15 pages with footnotes and bibliography. For this project, I thought it would be more interesting to have them construct a class website with the results of their research, and they created the London’s Pulse Projects site.

A cholera patient experimenting with remedies. (1832) Wellcome Images.

My goal was to get them to work with primary and secondary sources to construct a historical narrative. I say “narrative” rather than “argument” intentionally. Some of them actually DID uncover things that were somewhat unexpected and could serve as the basis for more in-depth and advanced research projects. In general, I don’t expect students at this level (many taking their first and only history of science or medicine class) to know enough about the historical literature to judge whether or not they are making an original contribution to that literature. At least, I expect they need considerably more guidance to do this. I have another couple of assignments where I work with them on how to make a historical argument. With this assignment I wanted to encourage students to dig around in an extensive set of primary sources and find things that struck them as interesting. Further, because this was a website, not a formal research paper, I encouraged students to write in a somewhat more relaxed and informal style, more suited to a broad general audience. In terms of content, this assignment also fit with one of the major themes of the course: health inequalities and the importance of the social determinants of health.

The stages of the assignment were:

Explore the database (1 class day plus an on-line discussion). In the first class period of this assignment, I broke each group into 3 subgroups. I asked each subgroup to search the MOH reports for the following keywords: cholera, rats, and fog. I advised them that each of these terms would pull up hundreds of records, and recommended that they see what happened if they limited to a certain year or set of years, or to a borough or set of boroughs. Each subgroup was to scan through the reports and note at least THREE trends or things that struck them as interesting or surprising. They were then asked to pick ONE MOH report with their search term and read it more carefully. They had to explain why they picked this particular report, write a brief description of what it said about the search term, and share it with the rest of their group. Once they had discussed the search terms in their groups, all the subgroups shared their findings with the entire class. In this way, they could see certain patterns emerging, but also anomalies and differences introduced by the use of different limiting search terms.

After the first in-class discussion, I asked all students to read the “Health of London Timeline” on the London’s Pulse website and get familiar with the key events in the history of public health in modern London. In addition, each student picked one blog post about “London’s Pulse” from the Wellcome Library’s blog. These posts are by scholars explaining how they used the information in the MOH reports in their research. I thought this would give students some idea of the kinds of questions and problems that the historical data in London’s Pulse can answer. Each student read one of these posts and described it in about a paragraph in an on-line discussion with their group members. In addition to their original post, each student responded to at least two other students’ posts.

Choose a topic (1 class day). In the second class devoted to this project, each group had to come up with three keywords of their own to search. If they found one of my keywords particularly interesting, they could use it again, but they had to come up with two more. I gave them some suggestions, but urged them to get creative. My suggestions included: a disease (e.g. syphilis), a type of illness (e.g. diarrhea, infant mortality), an institution (e.g. workhouse, factory, hospital), a time period (e.g. World War I), weather (e.g. fog, rain), a public health measure (e.g. immunizations). Once again, they broke into three subgroups and each searched the MOH reports for the keywords. At the end of this class they had to decide which of their keywords would make the most interesting topic for the group project. A few groups needed another day to figure this out, but most came up with a general topic by the end of the class period.

Workhouse for 300 paupers – ground plan. From Annual report of the Poor Law Commissioners for England and Wales. (1835-1847). Wellcome Images.

I set up the website (using WordPress) with the six topics they chose: child labor, workhouses, pollution, World War II, infant and child mortality, and cholera.

I set up front pages and associated pages for each topic. I gave every student access to the site so they could edit it. Students added content, both text and images, to each of the pages.

Find sources and make a work plan (1 class day plus an on-line discussion). I asked students to assemble a bibliography of primary and secondary sources, including MOH reports, and to divide the project into subtopics that one or two students could work on. Each person could contribute something separate, or small groups of 2-3 could collaborate.

Writing (about 3 class days were devoted to writing and editing pages). Each student was required to write 700-1000 words and to use 5 sources (ideally a mix of primary and secondary).

I read through drafts of the pages and made comments. The site remained private until I had graded everything. I corrected grammar and spelling as I read through the final version of each page. I removed images (and sometimes replaced them with others) if they were under copyright.

Things that worked:

  • Many students got really enthused by this project and wrote very thorough, well-researched pages. They wrote longer pages and used more sources than were required.
  • All groups devised interesting projects and did a great job dividing up the work equally. The fact that each student was responsible for an individual page alleviated concerns that stronger students would be expected to “carry” weaker students (a perennial problem with group work), while the fact that they were working as groups meant that their projects were larger and more in-depth than an individual research project. They were also able to share sources.
  • Some students made really interesting connections that I didn’t expect. For example, Danya Majeed juxtaposed William Cadogan’s advice on child rearing with the regimen for children in workhouses. While Cadogan’s text was meant for wealthy families, his advice about not “coddling” children and the need to “toughen them up” arguably informed the draconian regimes in workhouses and other places where children were institutionalized. Tracy Turner’s discussion of infant nutrition in the project on Infant and Child Mortality shows how Medical Officers of Health felt they had to educate poor mothers in the proper care and feeding of their children.  There is some acknowledgment of the constraints these women faced in bringing up their children, but considerably more blame placed on them than sympathy. Jenna McGrath draws trenchant parallels between sweatshop labor in Victorian Britain and today.
A man covering his mouth with a handkerchief, walking through a smoggy London street. Wellcome Images.

Things that didn’t go so well:

  • Some pages are quite superficial. I made clear that this website would be public and that other members of the class would read it, so I assumed people would not want to display shoddy work publicly, but I was mistaken.
  • Many students did not make as much use of the MOH reports as I would have liked. (They brought in one or two to illustrate a point they found in the secondary literature, rather than bringing in secondary literature to understand what they read in the MOH reports.)
  • Despite what I thought were pretty extensive discussions in class on how to find scholarly secondary sources in databases like the “History of Science, Technology and Medicine” and “Historical Abstracts,” some students relied heavily on sources they found through Google.
  • I needed to give students more instruction than I did in using WordPress. It’s not that difficult, but there were a few hiccups (e.g. one student inadvertently deleted another student’s work). I don’t feel like I really gave them the skills to blog independently, and I’d like to do that next time.

 

The Sphygmomanometer and its Impact on Clinical Practice

The following post is a paper written by Haylie Helms (@HaylieHelms), a junior majoring in Biology, Society, and Environment at the University of Minnesota, in the spring semester of 2015 for Dominique Tobbell’s class HMED 3075.  In a recently published pair of articles in the Bulletin of the History of Medicine, Dominique Tobbell and Lois Hendrickson described their use of historical artifacts (from the Wangensteen Historical Library) in their history of medicine courses.  Ms. Helms’s paper is the third of three papers offered as examples of the work students have done in their classes.

Introduction

Blood, and its circulation throughout the body, has been studied for thousands of years. The earliest recorded writings about the circulatory system can be found in the Ebers Papyrus, an ancient Egyptian manuscript dating back to 1500 BCE.[1] The Egyptians acknowledged the presence of mtw, which can be roughly translated as vessels that transport blood and nutrients throughout the body.[2] How the blood circulated remained a highly debated topic throughout much of the 17th century. Although William Harvey, an English physician, is credited with the discovery of blood circulation in 1628, most physicians of the time believed that the lungs were responsible for moving the blood.[3] Once the connection between heart rate and pulse was discovered, it became possible to determine blood volume and blood pressure.[4] Blood pressure was measured for the first time by Stephen Hales in 1733.[5] Hales placed a brass tube into the crural artery of a mare and connected to it a glass tube that was nine feet long.[6] By measuring how high the blood rose, Hales was able to calculate the blood pressure. It was not until nearly a century later that blood pressure was studied accurately, and early methods of studying blood pressure in humans followed the same invasive technique. The first noninvasive blood pressure measurement tool, called a sphygmomanometer, was invented in 1881 by Samuel Siegfried Karl Ritter Von Basch.[7]
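
The conversion behind such a measurement is simple hydrostatics: a column of fluid of height $h$ and density $\rho$ exerts a pressure $P = \rho g h$, so a column of blood can be expressed as an equivalent column of mercury by scaling by the ratio of the two densities. As a rough worked example (the column height here is illustrative, not Hales’s own reported figure), a rise of about 2.5 m of blood corresponds to

$$h_{\mathrm{Hg}} = h_{\mathrm{blood}} \times \frac{\rho_{\mathrm{blood}}}{\rho_{\mathrm{Hg}}} \approx 2500\ \mathrm{mm} \times \frac{1.05}{13.6} \approx 190\ \mathrm{mmHg},$$

on the order of the arterial pressures later read directly from mercury sphygmomanometers.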

I will examine the introduction of the sphygmomanometer into medical practice using Roger’s sphygmomanometer from the Wangensteen Historical Library of Medicine. In particular, upon the invention of the sphygmomanometer, how was physicians’ understanding of heart health altered? In what ways did the invention of the sphygmomanometer impact clinical diagnoses? Lastly, how did the user of the sphygmomanometer change from its introduction into practice in 1881 until the 1930s? It is hypothesized that the invention of the sphygmomanometer alone did not change the physician’s understanding of heart health; it was in unison with many other areas of research that the fields of cardiology and pathology progressed. As a result of the progression of clinical diagnostics, the user shifted from the scientist to health promotion companies. No longer was it solely the physician who ordered blood pressure to be measured; the general public was requesting it as well. Therefore, the invention of the sphygmomanometer set the stage for a deeper understanding of heart anatomy and disease for both physicians and the general public alike.

The Von Basch Sphygmomanometer

Early methods for measuring blood pressure required glass tubes filled with mercury to be inserted into the artery of the patient. The invasiveness of the procedure limited the feasibility of using these devices as diagnostic tools. Von Basch’s 1881 noninvasive sphygmomanometer, however, used a rubber ball that was placed over the radial artery to suppress the pulse.[8] The rubber ball was filled with water and connected to a mercury tube.[9] When the pulse was no longer felt, the reading on the mercury tube indicated the systolic blood pressure.[10][11]

Figure 1: Samuel Siegfried von Basch: Sphygmomanometer. Wood engraving, Down’s surgical instrument catalogue, published 1906. Wellcome Library.

 

Von Basch went on to measure 100,000 patients’ blood pressure over the span of ten years using his sphygmomanometer.[12]  He concluded that normal blood pressure was between 135 and 165 mmHg.[13] He also noted instances when patients’ blood pressures were abnormal, along with their current symptoms or diagnoses. Through his work, Von Basch identified the equivalent of today’s hypertension (which he called latent atherosclerosis) and cardiac hypertrophy.[14]

Although the Von Basch sphygmomanometer was safe for patients and provided accurate clinical data, it was not widely used by physicians.[15] Dr. Scipione Riva-Rocci stated, “It is not surprising that despite the many persistent attempts to introduce sphygmomanometry into medical practice, this has remained nothing more than a luxury measurement or an unusual investigation.”[16] Most physicians of the time preferred older techniques, such as using the pressure from their hands to restrict flow.[17] The British Medical Journal published its view that the sphygmomanometer “pauperizes the senses and weakens clinical acuity.”[18]  It was not just the sphygmomanometer that was under scrutiny; physicians and scientists at the turn of the century were opposed to many new technologies. Upon the introduction of the x-ray machine into clinical practice in 1895, physicians still preferred to use their hands to make diagnoses.[19]

The Introduction of Modern Blood Pressure Measurement

In 1896 Dr. Scipione Riva-Rocci published his method of measuring blood pressure.[23] Riva-Rocci’s method placed a 5 cm band around the patient’s arm and inflated it using a bulb filled with air.[24] The cuff was inflated until the pulse was no longer detected; like the Von Basch sphygmomanometer, the value recorded was the systolic pressure.

Figure 2: Riva-Rocci-type sphygmomanometer, originally developed in 1896.[25] Wangensteen Historical Library of Medicine.

In 1901 Von Recklinghausen found a crucial flaw in Riva-Rocci’s system: the band was too narrow.[26] The narrow band formed an acute angle between the cuff and the skin, which caused local areas of high pressure buildup that skewed the pressure reading.[27] Von Recklinghausen fixed the problem by simply widening the band to 12 cm.[28] The sphygmomanometer with the wider band provided accurate and safe blood pressure readings that could be used for clinical diagnostics and research. However, this method only allowed for the measurement of the systolic blood pressure and not the diastolic pressure.

In order to measure both the systolic and diastolic pressure, the oscillometric technique, which was created in 1876, was paired with the sphygmomanometer.[29] Under the oscillatory blood pressure method, the user applied the Riva-Rocci method while watching the oscillations transmitted to the mercury in a manometer.[30] When the cuff pressure was equal to the arterial pressure, the compressed artery would throb, causing small fluctuations in the cuff pressure.[31] The fluctuations would transition from small to large oscillations, identifying the diastolic pressure.[32]

The Riva-Rocci method plus the oscillatory method would be modified once more to arrive at the basic method of blood pressure measurement seen today. In 1905 Russian surgeon Dr. Nikolai Korotkoff identified the difference in sound between the systolic and diastolic pressures.[33] Using a stethoscope, Dr. Korotkoff was able to hear tapping sounds, which he explained as the blood flowing back into the artery.[34] These tapping sounds became known as the Korotkoff sounds, and the slight difference in the way blood pressure was recorded changed the way physicians viewed the device. Initially, many physicians had opposed sphygmomanometers because they believed the device displaced their reliance on their senses and weakened clinical acuity. By requiring the physician to listen for the sound, the new method restored the prestige of the measurement, since only those trained could properly acquire and interpret the data. The Korotkoff sound set the stage for future cardiologists to uncover the underlying pathology.[35]

The Impact of the Sphygmomanometer

Physicians in the late 1890s understood the effects of the vessels and heart on blood pressure: blood pressure is controlled by the constriction and dilation of blood vessels and by the frequency and stroke volume of the heart.[36]  Riva-Rocci himself cautioned physicians about the limits of the sphygmomanometer’s usefulness in clinical diagnosis:

Therefore, if all aspects of the problem were like this [multiple factors affecting the blood pressure], and only like this, sphygmomanometry would not have any clinical applications. The data it supplied would only give us abstract information of purely academic interest, but nothing that could be used on patients or for learning the course of a morbid process.[37]

Over the next decades, many researchers studied blood pressure readings in order to uncover their usefulness. Common studies looked for trends associated with patient size, temperature, position (sitting, standing, and lying), age, occupation, diet, sleep, time of day, alcohol and tobacco, mental state, exercise, external temperature, atmospheric pressure, and menstruation.[38] Despite all of the research conducted, the data sphygmomanometers collected was still not very useful in clinical diagnostics for pathological diseases.

As with all knowledge of symptoms and disease, discoveries were made through repeated exposure and documentation. When patients had abnormal blood pressures that could not be explained, physicians documented their symptoms and the course of their illness. For example, arterio-sclerosis was diagnosed with the aid of the sphygmomanometer together with the observation of thickening of superficial arteries, signs of an enlarged left ventricle, and the “ringing aortic second sound.”[39]

As time went on and more correlations were found between symptoms, vital signs, and diagnoses, physicians began to uncover the usefulness of the sphygmomanometer in clinical practice. In 1910 Dr. Janeway argued that the sphygmomanometer was most valuable in diagnosing hypertension.[40] Physicians understood the effects and dangers of hypertension. Dr. Janeway wrote,

Hypertension is not merely a symptom of diagnostic and prognostic value, nor is it to be considered only as an effect of causes acting on the heart and vessels. It is of itself a source of altered function throughout the circulatory system, which leads to further secondary changes. These cannot in all cases be clearly separated from the primary changes producing the high pressure, but they may frequently be distinguished anatomically as well as theoretically.[41]

Unlike an x-ray machine, which can be used as the sole diagnostic tool when determining whether a bone is broken, the sphygmomanometer alone cannot be used to diagnose a patient. Mental state, temperature, diet, exercise, and sleep all must be considered, as physicians understood the effect they can have on blood pressure.[42] As displayed in Dr. Janeway’s writings, upon the introduction of the sphygmomanometer into clinical practice, the physician’s understanding of heart pathology was further developed. The device itself did not change physicians’ understanding of heart health; it set the foundation for further medical discoveries.

The Evolution of the User

The sphygmomanometer’s history closely parallels the thermometer’s in its invention, the development of knowledge around it, its introduction to practice, and the evolution of its user.[43] The sphygmomanometer was initially sold to physicians and clinical researchers with the mindset that it could only be operated by someone with extensive training in the sciences.[44] It was believed that only those with a scientific background could properly acquire precise data and interpret the results. The device itself was not hard to use, however. As one physician wrote, “after all, no mysterious nor difficult [technique] is involved in sphygmomanometry, as the study of blood pressure may be correctly termed.”[45] Nonetheless, the prestige associated with the user upon the invention of the sphygmomanometer made it acceptable only for those trained extensively in the sciences, such as physicians or researchers, to operate it.

As physicians became more aware of the associations between blood pressure and heart pathology, demand for the sphygmomanometer increased. Blood pressure, measured by the sphygmomanometer, became a standard vital sign taken on patients by the 1910s.[46] Dr. Satterthwaite wrote, “No physical examination is complete without a record of the blood pressure. It is also very helpful in the diagnosis and management of cardiovascular and renal diseases and toxemias.”[47] In addition, heart health became a topic of discussion in local newspapers. Columns written by doctors explained what blood pressure is, how it is measured, and the importance of monitoring it.[48] Ads urged patients to be more aware of their heart health and to actively monitor their blood pressure: “Your high blood pressure can be measured. The sphygmomanometer registers it with absolute accuracy.”[49] Other newspaper ads used fear to urge patients to be more aware of their blood pressure: “He closed the door behind him and walked down the stairs in a kind of a daze, the doctor’s words ringing in his ears: ‘High blood pressure.’  ‘You may die any time – you can’t live over three years.’”[50]  “Arterio-sclerosis. Recent knowledge of the disease from which Paul Morton died.”[51]

As the demand for the sphygmomanometer increased at a rapid rate, physicians were inundated with blood pressure requests. As with the thermometer, physicians began to realize that measuring blood pressure was tedious and repetitive.[52] To ease the strain on physicians, nurses were needed to use the sphygmomanometer; the modern hospital depended on the invention of the ‘thinking nurse.’[53] As Margarete Sandelowski outlined, the ‘thinking nurse’ was necessary for the quantification of clinical signs and symptoms.[54] Nurses were needed to carry out physicians’ orders and to have the knowledge to record, interpret, and report information vital to diagnosis and treatment.[55]

The user evolved once more: after the nurse was qualified to operate the sphygmomanometer, health promotion companies began utilizing it. Health insurance companies used the sphygmomanometer to promote healthy living and to screen future customers for life-threatening diseases. Some companies required a blood pressure evaluation prior to granting insurance coverage.[56] Other companies used the sphygmomanometer to attract customers: “For your own sake and for the sake of those you love and who are dependent on you, you should investigate the Witter Water Treatment.”[57] “When the sphygmomanometer, before your eyes, shows that your pressure has been reduced, there is no chance of error. We get, in the majority of cases, a marked reduction in pressure after one treatment.”[58] Although treatments such as the Witter Water Treatment were not scientifically proven, and likely ineffective, they brought the sphygmomanometer to the attention of the public. Patients were now approaching their physicians for blood pressure measurements. These actions by patients and outside organizations aided in the shift of medicine at the turn of the century to a scientific, evidence-based system.[59]

Conclusion

The invention of the sphygmomanometer alone did not change the physician’s understanding of heart health. It was in unison with many other areas of research that the fields of cardiology and pathology progressed. The work of many scientists, including but not limited to Von Basch, Riva-Rocci, and Korotkoff, laid the foundation for future cardiologists and pathologists. Symptoms were recorded in addition to blood pressure readings to properly diagnose patients. Initially, blood pressure readings could only be obtained by physicians, as there was a belief that the physician was the only one scientifically inclined enough to record and interpret the data. However, as blood pressure became a standard of practice and patients began to request blood pressure readings, physicians were inundated with the tedious task of taking blood pressures. To alleviate the strain on physicians, nurses were trained to take blood pressure. The ‘thinking nurse’ led to increased efficiency in the hospital and a more scientifically based approach to diagnostics. In addition, the routine use of the sphygmomanometer led to a shift in knowledge surrounding heart anatomy and disease from physician to public. Newspapers ran advertisements to inform the public of the importance of getting regular blood pressure readings. Other organizations, such as insurance companies, understood the dangers of high blood pressure and required customers to receive a blood pressure test before they could receive insurance. Prior to the invention of the sphygmomanometer, routine tests in the clinical setting were not possible due to the invasiveness of the procedure. Therefore, the invention of the sphygmomanometer set the stage for a deeper understanding of heart anatomy and disease for both physicians and the general public alike, while aiding in the hospital’s shift toward a more scientific approach at the turn of the century.

Bibliography

“Display Ad 11.” Los Angeles Times, August 4, 1923.

“Display Ad 451.” Los Angeles Times, February 4, 1923.

American Diagnostic Organization. “History of the Sphygmomanometer.” Accessed November 11, 2015.

Barr, Justin. “Vascular Medicine and Surgery in Ancient Egypt.” Journal of Vascular Surgery 60, no. 1 (2014): 260-263.

Booth, Jeremy A. “A Short History of Blood Pressure Measurement.” Proceedings of the Royal Society of Medicine 70, no. 11 (1977): 793-799.

Detroit Pharmaceutical Co. Catalogue of Physician’s Supplies: Including drugs and chemicals, dispensing supplies, pharmaceuticals, surgical instruments, electric apparatus, trusses and appliances. Michigan: Aldine Printing Works, 1894.

Evans, Dr. W.A. “How to Keep Well: Blood Pressure.” Chicago Daily Tribune, June 28, 1914.

Faught, F.A. Blood-Pressure Prime. Philadelphia: G.P. Philling & Son, 1914.

Howell, Joel. Technology in the Hospital: Transforming Patient Care in the Early Twentieth Century. Baltimore: Johns Hopkins University Press, 1995.

Janeway M.D., Theodore Caldwell. The Clinical Study of Blood Pressure. New York: D. Appleton and Company, 1910.

Kotchen, Theodore A. “Historical Trends and Milestone in Hypertension Research: A Model of the Process of Translational Research.” Journal of Hypertension 58 (2011): 522-538.

Middleton, Dr. William S. “Blood Pressure Determination: A Nursing Procedure.” The American Journal of Nursing 30, no. 10 (1930): 1219-1225.

Noyes, Bradford. “The History of the Thermometer and the Sphygmomanometer.” Bulletin of the Medical Library Association 24, no. 3 (1936): 155-165.

Ogedegbe, Gbenga. “Principles and Techniques of Blood Pressure Measurement,” Cardiology Clinics 28, no 4. (2010): 571–586.

Riva-Rocci, Dr. Scipione. “A New Sphygmomanometer.” Gazzetta Medica di Torino 47, no. 50 (1896): 981-996.

Sandelowski, Margarete. Devices and Desires: Gender, Technology, and American Nursing. Chapel Hill: The University of North Carolina Press, 2000.

Satterthwaite, Dr. Thomas E. Cardio-vascular Diseases: Recent advances in their physiology, diagnosis, and treatment. New York City: Lemcke and Buechner, 1913.

Science Museum Brought to Life, Exploring the History of Medicine. “Bloch Type Sphygmomanometer, Paris, France, 1881-1913.” Accessed December 7, 2015.

Soto-Perez-de-Celis, Enrique. “Karl Samuel Ritter Von Basch: the Sphygmomanometer and the Empire.” Journal of Hypertension 25, no. 7 (2007): 1507-1509.

Tracy, Dr. S.G. “Arterio-Sclerosis: Recent Knowledge of the Disease from which Paul Morton Died.” The Washington Post, January 25, 1911.

 Endnotes

[1] Justin Barr, “Vascular Medicine and Surgery in Ancient Egypt,” Journal of Vascular Surgery 60, no. 1 (2014): 260.

[2] Barr, “Vascular Medicine and Surgery in Ancient Egypt,” 261.

[3] Jeremy A. Booth, “A Short History of Blood Pressure Measurement,” Proceedings of the Royal Society of Medicine 70, no. 11 (1977): 793.

[4] American Diagnostic Organization. “History of the Sphygmomanometer.” (accessed November 11, 2015).

[5] Booth, “A Short History,” 794.

[6] Booth, “A Short History,” 794.

[7] Enrique Soto-Perez-de-Celis, “Karl Samuel Ritter Von Basch: the Sphygmomanometer and the Empire,” Journal of Hypertension 25, no. 7 (2007): 1507.

[8] Soto-Perez-De-Celis, “Karl Samuel Ritter Von Basch,” 1507.

[9] Soto-Perez-De-Celis, “Karl Samuel Ritter Von Basch,” 1508.

[10] Soto-Perez-De-Celis, “Karl Samuel Ritter Von Basch,” 1508.

[11] Soto-Perez-De-Celis, “Karl Samuel Ritter Von Basch,” 1508.

[12] Soto-Perez-De-Celis, “Karl Samuel Ritter Von Basch,” 1508.

[13] Soto-Perez-De-Celis, “Karl Samuel Ritter Von Basch,” 1508.

[14] Soto-Perez-De-Celis, “Karl Samuel Ritter Von Basch,” 1508.

[15] Theodore A. Kotchen, “Historical Trends and Milestone in Hypertension Research: A Model of the Process of Translational Research.” Journal of Hypertension 58 (2011): 522.

[16] Dr. Scipione Riva-Rocci, “A New Sphygmomanometer.” Gazzetta Medica di Torino 47, no. 50 (1896): 985.

[17] Science Museum Brought to Life, Exploring the History of Medicine. “Bloch Type Sphygmomanometer, Paris, France, 1881-1913.” (accessed December 7, 2015).

[18] Kotchen, “Historical Trends and Milestone in Hypertension Research,” 522.

[19] Joel Howell, Technology in the Hospital: Transforming Patient Care in the Early Twentieth Century, (Baltimore: Johns Hopkins University Press, 1995), 103-132.

[20] Riva-Rocci, “A New Sphygmomanometer,” 984.

[21] Riva-Rocci, “A New Sphygmomanometer,” 983-984.

[22] Riva-Rocci, “A New Sphygmomanometer,” 989.

[23] Booth, “A Short History,” 797.

[24] Riva-Rocci, “A New Sphygmomanometer,” 985.

[25] Riva-Rocci, “A New Sphygmomanometer,” 985.

[26] Booth, “A Short History,” 797-798.

[27] Booth, “A Short History,” 797-798.

[28] Booth, “A Short History,” 798.

[29] Gbenga Ogedegbe, “Principles and Techniques of Blood Pressure Measurement,” Cardiology Clinics 28, no 4. (2010): 572.

[30] Booth, “A Short History,” 798.

[31] Booth, “A Short History,” 798.

[32] Booth, “A Short History,” 798.

[33] Booth, “A Short History,” 798.

[34] Booth, “A Short History,” 798.

[35] Booth, “A Short History,” 798.

[36] Riva-Rocci, “A New Sphygmomanometer,” 989.

[37] Riva-Rocci, “A New Sphygmomanometer,” 989.

[38] Theodore Caldwell Janeway M.D., The Clinical Study of Blood Pressure. (New York: D. Appleton and Company, 1910), 108-127.

[39] Janeway, The Clinical Study of Blood Pressure, 143.

[40] Janeway, The Clinical Study of Blood Pressure, 137.

[41] Janeway, The Clinical Study of Blood Pressure, 148.

[42] Janeway, The Clinical Study of Blood Pressure, 108-127.

[43] Bradford Noyes, “The History of the Thermometer and the Sphygmomanometer.” Bulletin of the Medical Library Association 24, no. 3 (1936): 155-165.

[44] Detroit Pharmaceutical Co., Catalogue of Physician’s Supplies: Including drugs and chemicals, dispensing supplies, pharmaceuticals, surgical instruments, electric apparatus, trusses and appliances (Michigan: Aldine Printing Works, 1894).

[45] Dr. William S. Middleton, “Blood Pressure Determination: A Nursing Procedure.” The American Journal of Nursing 30, no. 10 (1930): 1219.

[46] Dr. Thomas E. Satterthwaite, Cardio-vascular Diseases: Recent advances in their physiology, diagnosis, and treatment (New York City: Lemcke and Buechner, 1913), 40.

[47] Satterthwaite, Cardio-vascular Diseases, 40.

[48] Dr. W.A. Evans, “How to Keep Well: Blood Pressure.” Chicago Daily Tribune, June 28, 1914, A4.

[49] “Display Ad 11.” Los Angeles Times, August 4, 1923, 7.

[50] “Display Ad 451.” Los Angeles Times, February 4, 1923, X123.

[51] Dr. S.G. Tracy. “Arterio-Sclerosis: Recent Knowledge of the Disease from which Paul Morton Died.” The Washington Post, January 25, 1911, 6.

[52] Noyes, “The History of the Thermometer and Sphygmomanometer,” 155-165.

[53] Margarete Sandelowski, Devices and Desires: Gender, Technology, and American Nursing (Chapel Hill: The University of North Carolina Press, 2000), 21-43.

[54] Sandelowski, Devices and Desires, 21-43.

[55] Sandelowski, Devices and Desires, 21-43.

[56] F.A. Faught, Blood-Pressure Prime. (Philadelphia: G.P. Philling & Son. 1914).

[57] “Display Ad 451,” X123.

[58] “Display Ad 11,” 7.

[59] Howell, Technology in the Hospital, 30-68.

A Beautiful Technology: The Lost Art of Triangular Bandaging

The following post is a paper written by Maria Null, a junior majoring in Biology, Society, and Environment at the University of Minnesota, in the spring semester of 2015 for Dominique Tobbell’s class HMED 3075.  In a recently published pair of articles in the Bulletin of the History of Medicine, Dominique Tobbell and Lois Hendrickson described  their use of historical artifacts (from the Wangensteen Historical Library) in their history of medicine courses.  Ms. Null’s paper is the second of three papers offered as examples of the work students have done in their classes.

Bandages have demonstrated power as a medical technology in their many varieties, applications, and restorative health qualities throughout the history of humanity. Bandaging as a ‘practice’ has been characterized as an ‘art’ in the literature of medicine, nursing, and mortuary sciences cross-culturally.  Notably referred to as a “dying art,” early bandaging is evidenced by the Egyptian practices of mummification, biblical texts, and popular fictional literature.[1] Injuries warranting bandaging in war, medicine, and civilian life are interrelated, and their treatments are informed by the dissemination of knowledge by those who have studied, practiced, and mastered the “art of bandaging.”

This paper will argue that the improvement of the materials of the triangular bandage, its notable efficacy in sustaining life after being wounded, and the dissemination of its associated knowledge for application during the First World War (1914-1918) contributed to its decline in status as a medical technology through reforms in nursing and ‘first-aid’ education.

“Bandage” is derived from the French term “bande,” meaning an article used to secure an injury and to bind it.[2] It is important, for the purposes of this essay, to distinguish between two terms most commonly used interchangeably for one another: “dressing” and “bandage.” In this text, “dressings” will be examined only briefly, as they are relevant to the application of a “bandage” to bind the dressing to a portion of the human body.  “Bandage” will refer to an apparatus used to bind a dressing, support a portion of the body, or directly cover a portion of the body during an emergency situation, in this last case acting as what the reader may mistakenly conceive to be a “dressing.”

The rise of the modern bandage, in its many forms, was coupled with increased knowledge of human anatomy and physiology, which enabled physicians and nurses to designate specific bandages for unique applications to certain portions of the human body. Four unique bandage types have been popularly employed in nursing since the early nineteenth century: the roller bandage, the four-tailed bandage, the scultetus, and the triangular bandage.[3]  Each bandage was used for a specific purpose, but each was credited with the capacity to retain dressings or splints.[4]  Among the four popular bandage types, the late 1800s saw the rise of what became known as the ‘triangular bandage.’  In 1831, Swiss surgeon Mathias Mayor was the first credited with acknowledging the utility of what he called the ‘handkerchief bandage.’[5]  But the ‘handkerchief bandage’ remained unpopular in medicine until nearly forty years later, when it was used in the field of battle by German surgeon Professor Johannes Friedrich von Esmarch.[6]  In texts and according to manufacturers, Esmarch’s “discovery” defined the triangular bandage throughout the First World War.  Indeed, he was the first to suggest printing the illustrations for use, which would come to uniquely characterize the triangular bandage.[7]  However, his accomplishment has been contested by medical doctors throughout history due to the leniency with which the triangular bandage may be defined as a technology.[8]

Figure 1. Professor Esmarch’s Bandage with Printed Graphic Illustrating Use

Esmarch’s Bandage: The triangular bandage is depicted with elaborate illustration to inform the user of its many applications.[9] Image captured at the Wangensteen Library.

The triangular bandage as a technology is defined by its existence as a physical object, its versatile functions, and the knowledge needed to apply it.[10]  Early nursing texts and supplementary readings suggest the bandage was amorphously fabricated from clothing, linen, pillowcases, handkerchiefs, and bed sheets in dire emergencies.[11] Additional texts describe the physical fabrication and measurements of the technology when bandages were produced purposely as a medical technology. Bleached or unbleached muslin or calico, linen, silk, or gauze was utilized in instruction and in the treatment of injury.[12]  The physical measurements of triangular bandages varied and were determined by user selection and eventually by the manufacturer.  Texts suggest an appropriate median measurement for the triangular bandage of the 1910s was approximately one square yard.[13] The bandage only served its many functions upon folding, which was illustrated in numerous nursing texts and additional ‘first-aid’ handbooks.[14]

Figure 2. A Manual of Instruction in Folding the Triangular Bandage

Standard illustration with instructional text in an aid book.  Figure 59 depicts starting materials, figure 60 the triangular bandage. Figures 61 through 65 depict conversion of the triangular bandage into a compact for storage or transport. Figures 66 and 67 illustrate the folds of the triangular bandage into a ‘cravat,’ and its subsequent rolling into its ‘cord’ form.[15]

Among its many functions, arresting hemorrhage was of particular importance in the field of battle, a function requiring extensive knowledge of the many utilities of the triangular bandage for proper treatment to be achieved. After its introduction to ambulance work in the field by Esmarch in the late 1800s, physicians began producing texts characterized by elaborate illustration and in-depth instruction on triangular bandaging.[16] Understanding texts from the late 1800s through the mid-1920s required knowledge of medical terminology and anatomy, as well as the motivation to learn and practice countless variations on the bandaging techniques.  Knowledge of the most useful materials and of the speed, neatness, and proper tension required in the application was integral to the success of the treatment, and often to the survival of the patient.  As the U.S. entered the First World War, few collective groups in America possessed the same drive and commitment to establishing the agency of the triangular bandage overseas as the United States Army Nursing Corps.

Nurses, patients, first responders, and surgeons observed and experienced the impact of the triangular bandage in medical practice. Because the bandage’s primary purpose was to serve in emergency situations, first responders were its typical users. First responders in the field of battle were soldiers of the U.S. Army who were equipped with the bandage in early ‘first-aid kits.’ A ‘first-aider,’ as defined by Major Charles Lynch of the Medical Corps of the U.S. Army, was any individual intervening during a medical emergency prior to the summoning or arrival of a physician.[17] Field hospitals staffed with nurses were frontlines of ‘first-aid.’  Triage and dressing stations saw casualties, and upon assessment, nurses fitted soldiers with more elaborate triangular bandages. Bandages in field hospitals were not themselves any more complex in fabrication than their field ‘first-aid’ counterparts; they were, however, endowed with greater power to treat injury by the nurses with the knowledge and experience to manipulate them.  By 1917 the national curriculum for nursing education had outlined coursework for instruction in the use of the triangular bandage.  Elementary bandaging was taught over five classes spanning ten hours.  Classes were taught by instructors or “competent” head nurses in association with surgery, orthopedics, and first aid.  The emphasis on critical manipulation, practice, and demonstration of bandaging skills required speed, efficiency, and dexterity of nurses.  Likely two or fewer hours were spent on instruction in the triangular bandage, which was acknowledged as a first-aid utility requiring additional education and training.[18] An additional ten hours of instruction on elementary nursing and first aid focused on preparing nurses to adapt readily to emergency situations, much like those in the army.  The outline of the curriculum emphasized the additional training required of nurses upon entry into the Army Nursing Corps.[19]

The evolution of the triangular bandage, in its fabrication and dimensions, contributed to its increasingly widespread use, manufacture, and distribution to civilians.  While the bandage was readily available to any user with a knowledge of textiles that could be manipulated for bandaging purposes, it was not until the mass production of the ‘Esmarch bandage’ by American consumer healthcare company Johnson & Johnson that civilians saw standard triangular bandages in their very own ‘First-Aid Kits.’  By 1917, Johnson & Johnson was regularly producing a triangular bandage for civilian ‘First-Aid Kits.’[20] The 36-inch-square ‘Esmarch Bandage’ was a staple of the kits, originally supplied to soldiers as early as the Spanish-American War in the late 1800s, when the company had first entered into war-time production efforts.[21] In notes on manufacturing for the American Red Cross, Johnson & Johnson included an “explanation of numbers shown on figures in illustration of the Esmarch Bandage.”[22] The graphic illustration attempted to recreate the iconic print characteristic of Professor Esmarch’s original fabrications.  The widespread development and distribution of the bandage extended the user base and established a market for American consumers of first-aid products.

Figure 3. Johnson & Johnson’s Manufactured “Esmarch Bandage”

Johnson & Johnson’s triangular bandage named for Professor Esmarch of Kiel. The bandage mimicked Esmarch’s original design with graphic illustration for the dissemination of knowledge of the bandage’s utilities.[23]

Bandages for purchase were fabricated from surgical gauze or muslin and were available at drug stores.  Not only were they staples of the ‘First-Aid Kit,’ but they were also included in the ‘Johnson’s First Aid Cabinet’ and the popular packet known as ‘Johnson’s First Aid for Wounds.’[24]  Both were features of aid in the railroad industry, manufacturing establishments, and schools.  The importance of ‘first-aid,’ and the responsibility of citizens to learn about it and administer it, was demonstrated in Johnson’s First Aid Manual.  The manual depicted improved bandaging materials and denounced the improvisation of triangular bandages, in effect suggesting that efficacy of treatment required the newer and standardized ‘first-aid’ materials supplied by the manufacturer.[25]

Indeed, Johnson & Johnson boasted of the efficacy of the triangular bandage, suggesting, “It is probable no other system of wound dressing can accomplish so much, and in such a reliable manner, in rendering first aid, as the use of the triangular bandage.”[26]  Improved function with the triangular bandage was coupled with improved materials.  As the bandage was developed, altered in size to be more compact, and its materials made more durable, ‘first-aid’ was rendered more successfully by soldiers, nurses, and civilians. This was demonstrated in part by the change in nursing curriculum from 1917 to 1933, which incorporated the new text published by A.D. Whiting, Instructor of Surgery at the University of Pennsylvania: Bandaging.  In his text Whiting outlined the effect of eliminating gauze as a fabric for the triangular bandage. While gauze had been a central fabric of earlier triangular bandages, Whiting proclaimed the improved treatment of injury achieved by bandages made of different fabric.  Whiting suggested that gauze was not sturdy enough to exercise the utilities of the bandage, and that bleached or unbleached muslin had proven better suited to proper bandaging and to sustaining life in emergency situations.[27]  Whiting’s text became the standard for bandaging in nursing curricula following the First World War and represented the transforming identity of the bandage in several key ways.

While the medical terminology in Whiting’s text maintained the integrity of the triangular bandage as a medical technology within the surgical and nursing professions (including the ‘Occipitofrontal Triangle,’ the ‘Iliofemoral Triangle,’ and the ‘Mentovertico-occipital Cravat’), American citizens in the 1910s and 20s were increasingly exposed to new language in popularized ‘first-aid’ education.[28]  Johnson & Johnson not only served as a manufacturer contributing to the physical evolution of the triangular bandage, but also served as a primary source of education on the procedures of ‘first-aid’ provided with their products.  Johnson’s First Aid Manual, in its many editions, used language contradictory to that of Whiting’s and other medical texts of the early 1910s.  A 1917 edition of Johnson’s explicitly stated, “no instructions are given in this Manual in respect to anatomy or physiology. A knowledge of these subjects is not deemed essential either to the intelligent use of the manual or the application of first aid.”[29]  The certainty with which the Manual declared features of medicine “non-essential” to ‘first-aid’ contributed to a shifting paradigm in U.S. ‘first-aid’ culture.  With the improvement of the bandage nearing the end of the First World War, manufacturers, educators, and the American Red Cross began to simplify and consolidate aid education to promote the identity of the ‘first-aider.’

In early texts, the complexity of the triangular bandage as a technology is evident. While the triangular bandage was free in form, it was manipulated by adhering to the principles of geometry. Rudimentary knowledge of the subject was required of users, who were instructed to fold the triangular bandage relative to its features: bases, sides, apexes, squares, quadrilaterals, triangles within larger triangles, extremities and ends, and angles of the triangle.[30] ‘Broad’ and ‘narrow’ folds of the bandage and its shapes as a ‘cravat’ and ‘cord’ classified its many configurations.[31] Efficacy of treatment and preservation of life required knowledge of each of these features.  Additionally, knowledge of how and where to apply a bandage, and with what pressure, was essential. A majority of texts alluded to the errors that would result from applying a triangular bandage ineffectively, such as continued hemorrhaging or arrested circulation. While it was often assumed that any attempt at immediate aid (erroneous or informed) would increase chances of survival, texts and aid books asserted that an improperly applied bandage could just as easily harm a patient as help them.[32] Extensive study and practice manipulating the triangular bandage were required of nurses to master the ‘art of bandaging.’ While acknowledging the considerable skill possessed by nurses who “perfected” the “art” of applying the triangular bandage in ‘first-aid,’ virtually no texts awarded them prestige.  This reflects the status of nurses in the historical context: they were characterized by tasks of manual dexterity requiring little subjective analysis of procedures.[33] The bandage itself, in contrast, had reached a new status as an efficacious medical technology as its ‘first-aid’ properties were realized by American civilians following the First World War.

The distribution of knowledge of the bandage to civilians was characterized by a change in the language of texts, in illustrations, photographs, and demonstrations, and in the conceptualization of the status of the bandage as a technology.  From descriptions of nearly twenty specialized triangular bandages in medical texts such as Whiting’s, to only eight in First Aid in Emergencies, the triangular bandage experienced a reduction in its versatility once adopted by the civilian ‘first-aider.’[34] Fewer varieties of the triangular bandage were included in texts intended for civilian education in aid. Texts, which once required understanding of medical terminology, anatomy, and geometry, were modified for the American consumer.  While highly descriptive texts instructed bandaging in the late 1800s, the 1910s saw the efficient integration of photographs with fewer lengthy text inserts.  Willing civilians could learn by following step-by-step depictions of bandaging, featured in numerous publications at the time. In her Illustrations of Bandaging and First-Aid, registered nurse Lois Oakes produced for the public eye knowledge once reserved for production and consumption by surgeons and nurses.  Oakes’ publication thoroughly depicted the many functions of the triangular bandage, attempting to preserve its status as a complex technology. The illustrations and the review by the American Journal of Nursing, however, reflected the declining status of the bandage and the popular assumption that any willing individual could become skilled in administering aid.[35]  The journal’s review proclaimed that Oakes illustrated the use of the bandage “so plainly that even an inexperienced person could study them with advantage.”[36] While not declaring the civilian’s capacity to supersede the skills of a trained nurse, the journal was suggesting that the civilian could acquire the skills once reserved for nurses. Additionally, demonstrations and lectures on first aid, known as “ambulance” work, became popularized with the public through the American Red Cross several years into the First World War.  A 1917 article in the Washington Post reflected varieties of newspaper clips from the time period: lectures on first aid and bandaging for the public.[37] These changes reflected the American attitude toward ‘first-aid’ by the end of the war: that the civilian had the agency and responsibility to become a competent first-aid responder.

Through evident changes in nursing education and ‘first-aid’ curriculum adoption, it is possible to examine the effects the evolution of the triangular bandage ultimately had on its own declining status as a medical technology.  While the bandage was increasingly manufactured to meet consumer demand in a new age of civilian ‘first-aid,’ the job that had once formally belonged to the nurse came under scrutiny.  Updates to the national curriculum in nursing by 1933 reflected a subtle but important decline in time dedicated to bandaging training.  Course time in elementary bandaging had been decreased from ten to eight hours of instruction.[38]  In contrast, however, there was a five-hour increase in emergency nursing and first aid.  This increase in instructional training (reflecting increased demand after the First World War) might have indicated increased emphasis on emergency bandaging techniques, had the triangular bandage maintained its status as a medical technology.  Rather, the opposite occurred.  Coursework objectives outlined Army and Red Cross nursing equipment training and training on wounds, fractures, and strains, with no explicit mention of bandaging.[39]  Red Cross texts listed as supplementary materials suggest a shift in American aid culture as the U.S. affiliate of the International Federation of the Red Cross and Red Crescent Societies was officially integrated into American nursing education.  The motivation of the American Red Cross in aid education was continuously transformed as America adopted ‘first-aid’ culture after the triangular bandage was made accessible.  Educating citizens was not only a public health measure, but also a market move in conjunction with consumer healthcare company Johnson & Johnson.  The triangular bandage and its manual in the ‘First-Aid Kit’ of the home, the factory, and the school empowered citizens and subsequently lowered the bandage’s status as a medical technology in the hands of the educated, practiced, and masterfully skilled artists of bandaging: America’s nurses.

Gradually, references to the Esmarch bandage began to disappear from texts by the 1940s. Due in part to the emergence of newer technologies (some developed by Esmarch himself, including a rubber bandage), but largely because of its declining status as a medical technology, the complexity of the triangular bandage fell out of favor in clinical and civilian texts.  Many outside actors had established the importance of first aid, namely the manufacturer Johnson & Johnson backed by the American Red Cross.  The ‘art of bandaging’ so masterfully executed by trained nurses was simplified and condensed to teach the civilian to adopt the new identity of the American ‘first-aider’: an obligated responder in emergency situations.  The relegation of the triangular bandage largely contributed to its disappearance as a prominent feature of nursing curricula.  However, the change was successful in establishing a solid foundation of ‘first-aid’ for American citizens.  From a soldier’s pocket to the nurse’s field hospital to the hands of the American adolescent in ‘first-aid’ class, the triangular bandage was one of the most versatile medical technologies ever to reach the hands of the American citizen.

[1] Herrmann, E.K. “The Dying Art of Bandaging,” Western Journal of Nursing Research Vol. 14, No.6 (1992): 791.

[2] Ambulance Work and Nursing (Chicago: W. T. Keener & Co., 1899), 67.

[3] Committee on Education of the National League of Nursing Education, Standard Curriculum for Schools of Nursing. (Baltimore: Waverly Press,1917), 86.

[4] Albert S. Marrow, The Immediate Care of the Injured (Philadelphia and London: W.B. Saunders Company, 1906), 108.

[5] Little, Vincent J. “The Fabric of First Aid: A History of the Triangular Bandage.” Pharmacy History Australia: The Newsletter of the Australian Academy for the History of Pharmacy no.9 (1999): 10.

[6] Ibid.

[7] Ambulance Work, 67.

[8] J.M. Grant M.D. “Professor Esmarch’s Triangular Bandage,” The Lancet (1874): 746.

[9] Johannes Friedrich von Esmarch, Samariterbriefe (Kiel: Verlag von Lipsius & Tischer, 1886), 29.

[10] Joel D. Howell, Technology in the Hospital: Transforming Patient Care in the Early Twentieth Century (Baltimore: Johns Hopkins University Press, 1995), 8.

[11] Major Charles Lynch, American Red Cross Abridged Text-Book on First Aid: Women’s Addition; a Manual of Instruction (Philadelphia: P. Blakiston’s Son & Co., 1913), 10; Alvah H. Doty, M.D., A Manual of Instruction in Principles of Prompt Aid to the Injured: Designed for Military and Civil Use (New York: D. Appleton and Company, 1890), 77.

[12] Henry R. Wharton, M.D., Minor Surgery and Bandaging (Philadelphia and New York: Lea Brothers & Co., 1902), 17.

[13] Albert S. Marrow, The Immediate Care of the Injured, 134.

[14] Ambulance Work, 68.

[15] Ibid.

[16] Henry C. Leonard, A Manual of Bandaging: Adapted for Self-Instruction. (Detroit: Daily Post Book Printing Establishment, 1876).

[17] Charles Lynch, American Red Cross Abridged, 2.

[18] Committee on Education. Standard Curriculum for Schools of Nursing, 85.

[19] Committee on Education. Standard Curriculum for Schools of Nursing, 117.

[20] W.G. Stimpson. Prevention of Disease and Care of the Sick, (Washington: Government Printing Office, 1917), 224.

[21] Ibid.; General Index to Red Cross Notes (New Brunswick: Johnson & Johnson, 1900), 18.

[22] General Index to Red Cross Notes, 104.

[23] A.D. Whiting, Bandaging, (Philadelphia and London: W. B. Saunders Company, 1915), 131.

[24] Johnson’s First Aid Manual, (Baltimore: Johnson & Johnson, 1917), 59.

[25] Johnson’s First Aid Manual, 69-75.

[26] Johnson’s First Aid Manual, 69.

[27] A.D. Whiting, Preface to Bandaging, 7.

[28] A.D. Whiting, Bandaging, 132-143.

[29] Johnson’s First Aid Manual, 3.

[30] Ambulance Work, 68.

[31] Ambulance Work, 69.

[32] Charles Lynch, American Red Cross Abridged, 1.

[33] Dominique Tobbell, “Nursing In the Early 20th Century.” (Lecture presented, Minneapolis, Minnesota, October 05, 2015).

[34] A.D. Whiting, Bandaging, 10; Eldridge L. Eliason, in Contents of First Aid in Emergencies (Philadelphia and London: J.B. Lippincott Company, 1915), v.

[35] Lois Oakes. Illustrations of Bandaging and First-Aid (Baltimore: The Williams and Wilkins Company, 1942).

[36] “Book Reviews” American Journal of Nursing. Vol 41, No. 1 (1941): 130.

[37] “First Aid Advice in Red Cross Lecture,” Washington Post, March 22nd, 1917, 4.

[38] Committee on Education of the National League of Nursing Education, A Curriculum for Schools of Nursing, (New York: National League of Nursing Education, 1932), 108.

[39] Committee on Education , A Curriculum, 155.

 

 

Electrocardiogram and Diphtheria in the early 20th Century

The following post is a paper written by Matthew Cohen, a senior majoring in Biology, Society, and Environment at the University of Minnesota, in the spring semester of 2015 for Dominique Tobbell’s class HMED 3075.  In a recently published pair of articles in the Bulletin of the History of Medicine, Dominique Tobbell and Lois Hendrickson described  their use of historical artifacts (from the Wangensteen Historical Library) in their history of medicine courses.  Mr. Cohen’s paper is the first of three papers offered as examples of the work students have done in their classes.

The electrocardiogram’s ability to observe the conduction of the heart changed medical practice for diphtheria in the early 20th century. My paper will argue that the implementation of the electrocardiogram by physicians changed the diagnosis, monitoring, and treatment of diphtheria-infected patients in America from 1900 to 1938. I will be using the electrocardiogram from the Wangensteen Historical Library collection. This paper will discuss how the electrocardiogram functions, and the implications and clinical relevance of using the electrocardiogram to diagnose the infectious disease diphtheria.

ECG from the Wangensteen Historical Library, University of Minnesota.

To understand the impact and effects of the electrocardiogram, it must first be defined as a medical technology. The electrocardiogram (ECG) is a medical technology used to record the electrical signals of the heart. According to Joel Howell’s definition of technology, the ECG is a medical technology by virtue of being a physical object, the activity it performs, and the technological know-how needed to use it.[1] The ECG is relatively easy to use and produces an accurate recording of the heart’s electrical signals; however, it is the scientific know-how required to understand the results that makes the ECG a medical technology. The information recorded by the ECG must be interpreted by someone skilled enough to relate the physical tracing to the physiological functions of the heart. This means that an anatomical and physiological understanding of the heart is required to interpret the results accurately.

The heart has two main components that must be considered: the physical heart and its electrical system. The heart is a muscle, composed of specialized cardiac cells which, under the influence of an electrical current, contract to force blood throughout the body.[2] The heart has four anatomically distinct chambers. The upper portion of the heart is divided into two chambers called the atria, and the lower portion is divided into two chambers called the ventricles. These chambers fill with blood, which is expelled from the heart to the body during systolic contraction. This is the physical pumping system of the heart, which must work in conjunction with the electrical system.

The coordinated contraction of the atria and ventricles is controlled by the heart’s electrical system. The ECG records this electrical activity, which is a representation of the heart’s physical functioning. The electrical signal passes through a well-established pathway inside the heart. Cardiac tissue has unique properties that allow it to generate its own electrical impulses. Under normal cardiac function, these impulses originate in a specific cluster of cardiac cells called the sinoatrial (SA) node. The impulses then travel via the intra-atrial pathways to the atrioventricular (AV) node,[3] resulting in the contraction of the left and right atria. The impulse then travels through the bundle of His and down the Purkinje fibers, resulting in the contraction of the left and right ventricles.[4] This is the normal pathway of the heart’s electrical activity, called normal sinus rhythm. Since all cardiac cells have the potential to generate an electrical impulse, abnormal conductive pathways can also occur.

The invention of the ECG is credited to Willem Einthoven in 1901.[5] Einthoven developed the standardized ECG pattern, which displays the electrical pathway discussed above. Einthoven’s work led him to develop the three-limb-lead placement, whose leads form an equilateral triangle (now known as Einthoven’s triangle) used to capture the heart’s electrical activity, an arrangement still used today.[6]

The essentials of the electrocardiogram have remained largely unchanged since its development. Einthoven’s early ECG was enormous: weighing over 600 pounds, it required two rooms and five people to operate.[7] Einthoven’s ECG measured fluctuations in the electrical current carried by a wire suspended between electromagnets.[8] The results were recorded onto a photographic plate. The electrocardiogram in the Wangensteen Historical Library collection was made by the Cambridge Instrument Company and was given the name “Simpli-Scribe Portable Model.”[9] This ECG was designed to be portable, with an exterior made of wood and a metal carrying handle.[10] This version of the ECG bears no date of manufacture, but Cambridge produced the Simpli-Scribe from 1945 to 1960.[11] The Simpli-Scribe is approximately one cubic foot in size and is powered by a two-prong electrical cord. The five leads of the ECG are attached to the patient via a strap placed on the chest. The resulting electrical fluctuations are recorded by an electrically heated metal needle that scorches the reading into the paper.[12] While the Simpli-Scribe is smaller and more refined than the earliest electrocardiographs designed by Einthoven, the basics remained unchanged: both required electromagnets, recorded fluctuations in electrical activity, and used wires attached to the patient at the same locations.

The use of the ECG in recording cardiac activity was well known by the mid-1910s. The electrocardiogram was hailed as an unquestionable way to determine whether one was sick or well.[13] The clinical application of the ECG for monitoring cardiac rhythms such as auricular fibrillation and extrasystoles was well documented.[14] In 1916, the British army adopted the electrocardiogram as a means to screen army recruits, in conjunction with X-rays, to determine their cardiac health and fitness to serve.[15] By this period, the ECG was already recognized for its ability to assess cardiac health. The electrocardiogram was so prominent that in 1924 Willem Einthoven was awarded the Nobel Prize in Physiology or Medicine for his discovery.[16] While the electrocardiogram was well known for its ability to record the heart’s electrical activity, its use in medicine during this time was limited to routine physical exams.  The importance of the ECG changed, however, when it was used to monitor and diagnose cardiac complications in infectious diseases such as diphtheria.

Diphtheria is an infectious disease caused by the bacterium Corynebacterium diphtheriae, which is spread from person to person through respiratory droplets.[17] The disease targets the mucous membranes of the mouth and nose, from which it can travel into the respiratory system.  Diphtheria produces symptoms similar to a common cold, such as weakness, sore throat, and low-grade fever.[18] A thick coating builds up in the mouth and nose, making it difficult to breathe. While these respiratory symptoms cause difficulty breathing, the majority of deaths associated with diphtheria are the result of the toxin produced by the bacterium.[19] The diphtheria toxin (DT) can travel into the bloodstream, where it damages internal organs such as the heart, lungs, and kidneys.[20] The damage caused to the heart is called myocarditis, an inflammation of the heart tissue that results in mortality as high as fifty percent if left untreated.[21]

Diphtheria was a widespread infectious disease in the United States in the early 20th century. In 1921 there were 206,000 confirmed cases of diphtheria, which resulted in 15,520 deaths.[22] The population of the United States in 1921 was 108,538,000 according to the United States Census Bureau,[23] which means that roughly 0.19% of the U.S. population was infected with diphtheria that year. Diphtheria remained strongly prevalent in the U.S. even after the introduction of a vaccine in 1923.[24] The vaccine provided an effective means of prevention, but treatments for the afflicted remained the same. The continued presence of diphtheria is shown by specialized hospital wards dedicated to diphtheria patients fifteen years after the vaccine was introduced.[25]
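As a quick sanity check on these figures (a minimal sketch: the variable names are mine, and the inputs are simply the numbers cited above), the incidence and the implied case-fatality rate can be computed directly:

```python
# Back-of-the-envelope check of the 1921 diphtheria figures cited above.
# Inputs are the numbers quoted in the text; variable names are illustrative only.
cases_1921 = 206_000               # confirmed diphtheria cases
deaths_1921 = 15_520               # deaths attributed to diphtheria
us_population_1921 = 108_538_000   # U.S. Census Bureau estimate

incidence_pct = 100 * cases_1921 / us_population_1921
case_fatality_pct = 100 * deaths_1921 / cases_1921

print(f"Incidence: {incidence_pct:.2f}% of the population")        # ~0.19%
print(f"Case fatality: {case_fatality_pct:.1f}% of reported cases")  # ~7.5%
```

By this arithmetic, roughly one in thirteen reported cases ended in death, which helps explain the urgency contemporary physicians attached to monitoring these patients.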

By the mid- to late 1920s, the effects of the diphtheria toxin on the neuromusculature and on fatty degeneration of the heart were well known, and physicians understood them to be longer lasting than previously thought. Jenner Hoskin, a doctor at the Philadelphia Hospital, conducted a study of the effects of diphtheria on the heart in 1926. Hoskin found that upon admission to the hospital 72% of patients had no cardiac complication except tachycardia, but that several days later 28% of patients were found to have abnormal pulses which degenerated into grave heart trouble.[26] While it is not clear what Hoskin meant by “grave heart trouble,” the implication was clear: diphtheria had deadly effects on the heart. Hoskin noted that virtually all patients admitted for diphtheria initially had tachycardia that persisted for 36 to 48 hours after initial treatment was given.[27] The continued presence of tachycardia beyond the first 36 to 48 hours was an indication of myocarditis, a result of the diphtheria toxin damaging the heart.[28] Hoskin’s use of persistent tachycardia over the treatment interval was common practice at the time. Hoskin argued for the importance of electrocardiograms in identifying myocarditis secondary to diphtheria and in monitoring the patient’s heart even after the hospital stay.[29] Close monitoring of the heart could result in early recognition of cardiac complications and reduce mortality rates. Physicians in the early 20th century recognized that death from diphtheria typically occurred on the 10th day of infection from acute myocarditis, or in the 3rd week from fatty degeneration of the heart muscle.[30] This highlights the importance of having an effective way to monitor cardiac function, which the electrocardiogram provided.

Prior to the widespread use of the ECG in monitoring heart function, a physician would assess heart health through auscultation of the heart, palpation of pulses, the physical appearance of the patient, and subjective statements from the patient.[31] Auscultation of cardiac sounds using the stethoscope was the preferred method.[32] The physician would listen to cardiac sounds at specific points on the chest to determine the health of the patient’s cardiac valves. The strength of the heart would also be assessed by feeling the pulse, typically at the wrist, to determine the heart’s regularity and strength.[33] The physical appearance of the patient was noted; pallor, or a pale appearance, was considered a sign of poor perfusion and attributed to cardiac insufficiency.[34] These assessments were subjective and depended on the knowledge of the individual physician to identify abnormalities.

Even in patients who survived the disease, lasting cardiac abnormalities were found, such as premature beats, diminished cardiac reserve, heart enlargement, and tachycardia. Hoskin stated that the lasting effects of diphtheria on the heart could be seen in the “slurring of the Q.R.S. complex” visible on the ECG.[35] The Q.R.S. complex represents the depolarization of the ventricles and their corresponding contraction. Cardiac abnormalities that resulted from the infection would need to be monitored over the patient’s life to ensure detection of worsening cardiac issues. Hoskin stated that it was of the “utmost importance that patients in whom cardiac symptoms and abnormalities have been discovered be thoroughly examined both clinically and electrocardiographically,” and that these examinations must continue until the condition resolved or stabilized.[36] The ECG could thus be used to monitor and diagnose diseases that affect the heart but are not cardiac in origin, such as diphtheria. Diphtheria may have no direct effect on the electrical conduction system of the heart, but it produces physical changes in the heart tissue that can be seen on the ECG tracing, such as slurring of the Q.R.S. complex.

The shift in the prominence of the ECG in cardiac monitoring can be seen in a 1938 article by Mason Leete, in which he states that it is “presumptuous to enter such a field [diphtheria] armed only with fingers, stethoscope and sphygmomanometer.”[37] This displays an attitude shift during the late 1930s in which the subjective information gathered by the physician decreased in importance and the objective information produced by medical technologies such as the ECG increased in significance.  The importance of the ECG to heart health was noted, but traditional measures of obtaining pulses, heart sounds, and, more recently, blood pressures were still valued.[38] The combined use of subjective information obtained by the physician and unbiased, technologically obtained data, such as from the electrocardiogram and sphygmomanometer, was regarded as the most effective way to monitor a patient’s disease progress. A careful comparison of clinical and electrocardiographic findings was required to thoroughly assess the patient’s health and provide an accurate prognosis.[39]

In treating diphtheria and monitoring the progress of infected patients, the use of the electrocardiogram became the standard. The importance of the ECG for diphtheria patients is emphasized in Harries’s 1932 article “Auxiliary Treatment of Toxic Diphtheria,” in which he states that “obtaining repeat electrocardiograms from the cases under treatment . . . provid[es] the most valuable evidence as to progress.”[40] This statement displays the value placed on ECGs in diphtheria patients as the most valuable tool for monitoring progress, even over the physical observations of the physician. The shift in preference is mirrored in a 1935 article on the electrocardiogram in diphtheria, in which the author states that “although there is much agreement between electrocardiographic and clinical signs the former usually proceeds the later and change in the curves may demonstrate lesions which cannot be discovered by other means.”[41] This marks an important shift from years prior: the ECG was now seen as the primary tool not only for monitoring patients but also for diagnosing previously unseen or unobservable lesions of infection. The instrument was sensitive enough, and its accuracy trusted enough, that the information displayed by the ECG was considered more reliable than what the physician could observe. The importance of the ECG was further shown in a 1937 article by Norman Begg, in which he states that “Survival depends upon the amount and distribution of undamaged or lesser damaged myocardial tissue,” the severity of which could be determined by the ECG.[42] The value of the ECG for monitoring the patient’s progress and prognosis was well established by this period, but the ECG also influenced the treatment of diphtheria.

The medicines used to treat diphtheria remained relatively unchanged from 1899 to the late 1930s. Early treatment consisted of the use of antitoxin against the diphtheria bacterium, although dosages varied significantly based on the preference of the treating physician.[43] In addition, the use of mercurials (mercury in solution, generally applied topically to the lesions of infection) was advocated, along with oxygen therapy as needed to treat difficulty breathing.[44] There is no mention of using electrocardiograms to monitor cardiac health. This picture changed dramatically in the mid-1920s with the spread of the ECG. While many of the medicines used to treat diphtheria remained unchanged, such as the use of antitoxin, the use of digitalis became prevalent to control the tachycardia that resulted from the diphtheria toxin.[45] This led to the ECG becoming a tool for determining when particular medicines were indicated for a patient.

The electrocardiogram was also important for its influence on the treatments given to diphtheria patients. The ECG was used to determine the extent of cardiac toxicity in infected patients, and it could exclude the use of certain drugs that would have been indicated in milder cases of diphtheria. Digitalis is an example: the ECG could be used to rule out its use for a particular patient because of cardiac damage.[46] The drug digitalis is well known for its effects in decreasing heart rate, increasing the strength of contractions, and increasing blood pressure.[47] Digitalis was generally indicated to treat tachycardia, but in moderate to severe diphtheria infections, as determined by electrocardiograph, it was contraindicated because of the increased cardiac demand it produced.[48] The ECG was used not only to detect abnormal conduction of the heart but also as a clinical tool to determine whether a medicine such as digitalis was safe for the patient.

The ECG was widely used in monitoring diphtheria patients and had a strong influence on the treatments rendered. Cases of diphtheria decreased rapidly in the late 1940s with the widespread use of the diphtheria vaccine, even though the vaccine had been developed in 1923.[49]  After the 1940s, the prevalence of diphtheria dropped sharply, from approximately 19,000 cases in 1945 to 1 in 1998.[50] The diphtheria vaccine all but eliminated new cases of the disease in the United States. The treatment for those infected, with the addition of antibiotics such as penicillin in the late 1940s, remains much the same.[51]

In conclusion, the electrocardiogram saw widespread use in the medical field as a tool for assessing cardiac function. Its use in monitoring cardiac condition had profound impacts on the diagnosis and treatment of diphtheria during the early 20th century.  The ECG became the standard for assessing the severity of infection, monitoring patients’ progress, determining their prognosis, and deciding which medicines could safely be used. The success of the ECG in diphtheria-infected patients redefined the ECG as a medical technology for diagnosing infectious diseases as well as cardiac diseases. The ECG is currently used in monitoring a wide variety of infectious diseases, including HIV, rubella, typhoid, rheumatic fever, and diphtheria.[52] The electrocardiogram changed medical practice for diphtheria and is still actively used in infectious disease care today.

Bibliography

Primary Sources:

Andersen. “The electrocardiogram in diphtheria” The Lancet (1935) 689.

Begg, Norman “Diphtheritic Myocarditis, an electrocardiographic study” The Lancet (1937).

Harries, E. “Auxiliary treatment of toxic diphtheria” The Lancet (1932).

“Heart Diagnosis in British Army.” New York Times (1916).

Hoskin, Jenner. “The after effects of diphtheria on the heart.” The Lancet (1926), 1141-1143.

Leete, Mason. “The Heart in Diphtheria.” The Lancet (1938).

McClanahan, H. “Treatment of Diphtheria.” The Lancet (1899).

Simpli-Scribe EKG, Wangensteen historical library collection. University of Minnesota- Twin Cities.

“Sure way to finding out how sick or well you are by measuring electricity in your body” The Washington Post (1914).

Williams, James. “Electrocardiogram in Clinical Medicine.” American Journal of the Medical Sciences (1910) 644-668.

Williamson, Bruce. “The rational use of Digitalis.” The Lancet (1928).

Secondary Sources:

“Anatomy of the Heart.” National Heart, Lung, and Blood Institute, National Institutes of Health (November 2011), accessed November 22, 2015.

Barold, S.S. “Willem Einthoven and the birth of clinical electrocardiography a hundred years ago.” Cardiac Electrophysiology Review (2003): 99-104.

“Definition of Penicillin History” MedicineNet (2012).

Diphtheria Centers for Disease Control (2013).

“Historical National Population Estimates.” United States Census Bureau (2000), accessed November 22, 2015.

Howell, Joel, D. Technology in the Hospital, Transforming patient care in the early twentieth century (Baltimore: Johns Hopkins University Press, 1995), 9

Nalmas, Sandhya. “Electrocardiographic Changes in Infectious Diseases.” Hospital Physician (2007), 3.

“Simpli-Scribe” Edward Hand Medical Heritage Foundation, accessed November 22, 2015.

Vaccine, Diphtheria Center for Disease Control, accessed November 22, 2015.

 

Endnotes

[1] “Anatomy of the Heart” National Heart, Lung and Blood Institute National Institute of Health (November 2011) accessed November 22, 2015.

[2] “Anatomy of the Heart” National Heart, Lung and Blood Institute

[3] “Anatomy of the Heart” National Heart, Lung and Blood Institute

[4] “Anatomy of the Heart” National Heart, Lung and Blood Institute

[5] S.S. Barold, “Willem Einthoven and the birth of clinical electrocardiography a hundred years ago.” Cardiac Electrophysiology Review (2003), 99-104.

[6] “Anatomy of the Heart” National Heart, Lung and Blood Institute

[7]  “Simpli-Scribe.” Edward Hand Medical Heritage Foundation, accessed November 22, 2015.

[8] “Simpli-Scribe.” Edward Hand Medical Heritage Foundation

[9]  Simpli-Scribe EKG, Wangensteen historical library collection. University of Minnesota- Twin Cities.

[10] Simpli-Scribe EKG, Wangensteen historical library collection.

[11] “Simpli-Scribe.” Edward Hand Medical Heritage Foundation

[12] “Simpli-Scribe.” Edward Hand Medical Heritage Foundation

[13] “Sure way to finding out how sick or well you are by measuring electricity in your body” The Washington Post (1914)

[14] James Williams, “Electrocardiogram in Clinical Medicine.” American Journal of the Medical Sciences (1910), 644-668.

[15] “Heart diagnosis in British army.” New York Times (1916), 21.

[16] Barold, “Willem Einthoven”, 99-104.

[17] Diphtheria Centers for Disease Control (2013) accessed November 22, 2015.

[18] Diphtheria

[19] Diphtheria

[20] Diphtheria

[21] Diphtheria

[22] Diphtheria

[23] “Historical National Population Estimates.” United States Census Bureau (2000), accessed November 22, 2015.

[24] Diphtheria

[25] Diphtheria

[26] Jenner Hoskin, “The after effects of diphtheria on the heart.” The Lancet (1926), 1141.

[27] Hoskin, “The after effects of diphtheria on the heart.”, 1142.

[28] Hoskin, “The after effects of diphtheria on the heart.”, 1142.

[29] Hoskin, “The after effects of diphtheria on the heart.”, 1142.

[30] Hoskin, “The after effects of diphtheria on the heart.”, 1142.

[31]  Mason Leete, “The Heart in Diphtheria.” The Lancet (1938)

[32]  Leete, “The Heart in Diphtheria.”

[33]  Leete, “The Heart in Diphtheria.”

[34]  Leete, “The Heart in Diphtheria.”

[35]  Hoskin, “The after effects of diphtheria on the heart.”, 1142.

[36] Hoskin, “The after effects of diphtheria on the heart.”, 1143.

[37] “The Heart in Diphtheria.”

[38] “The Heart in Diphtheria.”

[39] “The Heart in Diphtheria.”

[40] E. Harries, “Auxiliary treatment of toxic diphtheria” The Lancet (1932).

[41] Andersen “The electrocardiogram in diphtheria” The Lancet (1935), 689.

[42] Norman Begg, “Diphtheritic Myocarditis, an electrocardiographic study” The Lancet (1937).

[43] H. McClanahan, “Treatment of Diphtheria.” The Lancet (1899).

[44] McClanahan, “Treatment of Diphtheria.”

[45] Bruce Williamson, “The rational use of Digitalis.” The Lancet (1928).

[46] Williamson, “The rational use of Digitalis.”

[47] Williamson, “The rational use of Digitalis.”

[48]  “The Heart in Diphtheria.”

[49] Vaccines Center for Disease Control, accessed November 22, 2015.

[50] Vaccines

[51] “Definition of Penicillin History” MedicineNet (2012), accessed November 22, 2015.

[52] Sandhya Nalmas, “Electrocardiographic Changes in Infectious Diseases” Hospital Physician (2007), 3.

UPDATE on Teaching History of Medicine with Artifacts and Oral Histories

You can now access Teaching Medical History with Primary Sources: Introduction by Dominique A. Tobbell and Teaching with Artifacts and Special Collections by Lois Hendrickson on Project Muse, and read the full articles in the Bulletin of the History of Medicine, Volume 90, Number 1, Spring 2016.  The syllabi for classes mentioned in these essays can be found in the BHM syllabus archive.

Undergraduate students in the Wangensteen Historical Library’s reading room looking at a table with artifacts.

 

Teaching History of Medicine with Artifacts and Oral Histories

The next issue of the Bulletin of the History of Medicine contains a new section on pedagogy.  This inaugural section contains two articles, one on teaching medical history with artifacts (by Dominique Tobbell) and the other on teaching medical history using oral histories (by Lois Hendrickson).  In addition to the print articles, Professors Tobbell and Hendrickson are publishing on this blog three student papers that illustrate more concretely the kinds of work students produce in their courses.  These papers will be published as separate blog posts in the first week of April.

What follows is a preview of the articles you can read in the Bulletin later this week, as well as the student papers that will be posted on this blog.

Sphygmomanometer from the Wangensteen Historical Library, University of Minnesota.

Students are fascinated by the opportunity to touch history: to use artifacts, skim historical newspapers and archival documents, and listen to first-hand accounts of lived history as presented in oral history interviews.  They glean skills and significantly different perspectives from interacting directly with these primary sources. In their articles discussing pedagogical approaches to teaching with archival documents, artifacts, and oral history, Dominique Tobbell and Lois Hendrickson outline their methodology and student outcomes in two undergraduate courses on which they collaborated. In the first course, students examine a historical medical artifact from the Wangensteen Historical Library and situate its impact on health care practice, health care institutions, patients, consumers, and health care policy. The second course asks students to use oral histories to understand and reflect on the roles women have played as healers, patients, research subjects, and health activists in U.S. society. Their companion essays examine how they bring resources into the classroom and develop exercises enabling students to work with and analyze artifacts, texts and manuscripts, and oral histories.

Three student research papers from the Technology and Medicine in Modern America course provide examples of how students used historical artifacts and integrated course readings, lectures, and original research to write about how technology came to medicine’s center stage, and about the impact medical technologies had on medical practice, medical institutions, and medical consumers. The first essay, by Haylie Helms, discusses the impact sphygmomanometers had on clinical diagnosis, their introduction into practice, and the evolution of the user that came with the increased demand for sphygmomanometry. In the second essay, Maria Nulls details the demise of a technology, the triangular bandage, following its transformation from a technology used by trained medical practitioners to a simplified and condensed consumer product meant for civilian use, as seen in Johnson & Johnson’s ‘First-Aid’ Kits.  The final essay, by Matthew Cohen, argues that the implementation of electrocardiograms (ECG) changed the diagnosis, monitoring, and treatment of diphtheria. Cohen attributes this to the increased significance of objective information produced by medical technology, and details how this technology redefined the diagnosis of infectious diseases as well as cardiac diseases.

ECG from the Wangensteen Historical Library, University of Minnesota.
