Medicine and Meaning

Non-fiction

Puffy Girl Problems

By Morgan Sweere Treece

Lights, bright lights, blurry lights, headlights, flashing ambulance lights, EMT flashlights, fluorescent hospital lights. Those are probably the only things I can actually recall about that night.

November 15, 2012. It was a chilly Thursday evening, that time of year when the leaves have just fallen onto the straw-like, browned grass and everyone is getting ready to pull out their fuzzy, woolen scarves. It seemed no different from any other day. I got up, went to school, and everything was normal, but when I got home, I heard some of my favorite words leave my mom’s mouth: “Your cousins from Fayetteville are in town and headed out to the field.”

My cousins were on break from college and home for the week, which only happened once in a blue moon, so it was breaking news to me. What did this rare occurrence mean? Extreme night mudding! In mere seconds, I had hopped into my car and was speeding toward the field where we took the four-wheelers every time the cousins visited. It was really more of an obstacle course than a field, the way the trees seemed to spurt at random from the rocky soil, with large boulders and hills lining the moist, unmowed ground. We unloaded the four-wheelers, still caked with the old, dried mud and dust from our last night ride. Soon enough, the four of us were zipping over the rough and ragged barriers toward our designated finish line, the dry, brown grass grazing our ankles and bitter wind gnawing at our cheeks.

One of the next memories I have of that momentous evening is waking up, confused, hardly able to see, completely unable to think, having been rolled over by multiple unidentified hands.  In what seemed like hours but was actually seconds after that, I tried to understand what was happening.  I recognized no one around me.  My mind was unable to comprehend the situation.  I heard shouting – in the back of my mind, a perceived distant memory.  “HELP! She’s gushing blood! Morgan!”

EMTs, paramedics, cousins, and my mom all screeched in unnerving anxiety. They attempted to relieve the aching pain from my marred face and body soaked with blood. I couldn’t feel the pain, but somehow I knew it was there. I couldn’t talk. I couldn’t breathe. I couldn’t think. I had a major migraine, and I could feel the blood gushing from every pore of my body. In the icy, winter-like wind the blood began to dry, leaving an amber stain all over. I had no idea what was happening, and no one took the time to explain it to me. I could see the wrathful, crimson liquid encompass my line of vision in mere seconds, and I even tasted the iron soaking onto my tongue. Then all went black. The next 14 hours of reconstructive surgery and stabilization were marked by panic, worry, and God’s miraculous and loving work in the hands of many talented ER physicians and surgeons. The surgery was tedious but flawlessly executed in the end, soothing news to my anxiety-filled family in the waiting room.

Then I woke up in a haze. The intense, sterile hospital room, filled with dazzling yet blinding fluorescent lights and vaguely familiar faces, left me still unable to comprehend just where I was, or even who I was. The copious amounts of drugs in my body overtook my mind, and the five or so hours after I awoke are mostly a fuzzy cloud in my memory.

Several days later, I regained complete alertness in a sober state. Someone told me that my first spoken words after I roused from my morphine-induced post-surgical slumber were to solicit the immediate presence of my mother, my boyfriend, and (of course) a Sonic Coke. Clearly, my priorities were pretty straight, even for a pain-filled, drugged-up girl. I knew exactly what I wanted and what I would have missed greatly had I never seen those three things again, however trivial the last of them is.

When I woke up in the hospital, the first things I actually remember were large, dark, blurry mounds blocking my view. I blamed this “tunnel vision” on my cheeks, which had risen to an insanely abnormal height with the swelling of my entire face. Now that I think about it, though, this block in my sight may have been symbolic: seeing less temporarily so that I could see more now. It took a full six months of what I called “puffy girl problems” for all the inflammation to disappear, but after I finished being the metal chipmunk and began to regain the feeling in my face, I realized how important the experience was to me, however awful it may have been. And trust me, it was really awful.

The best and worst experiences of our lives are always our most important memories, because those experiences have the greatest impact on the way we live. I consider my ordeal one of the most painful and grueling experiences I have had at this point in life: I utterly shattered every possible bone in my face into 23 pieces, including both jaws. I had a concussion, and I practically became The Terminator, with four metal plates, four metal screws, and large wires and bands that starved me for six weeks. However, I also came to see it as one of the greatest blessings of my life, because it showed me the most important things in life: God, family, and friendships.

Even almost a year later, I often get caught up in the now almost routine conversations prompted by questions like “You broke your…face?” “Did it hurt?” “Can I touch it?” “You have metal in there?” “What happens when you go through airport security? Or if you get slapped?” and, most frequently, “Do you look different?” or “Do you like it?” Those last blunt inquiries always seem to strike me the most, although they are only frivolous queries to others. They stop me dead in my tracks more than anything else anyone could probably say to me. I’m not different. I’m still me. I still feel like me. But I’m not. I didn’t recognize myself in the mirror for almost three months after my surgery because, in truth, I am different, on the outside and the inside. Not only did my (still groundbreaking) stupidity and recklessness teach me not to attempt ramping a rocky slope at top speed on a four-wheeler in the autumn dusk just because I was dared to; it also taught me that I am not Superman, as much as I want to be or think I am most of the time. I shouldn’t live like it, either. I can’t live like I’m invincible, like I can’t get hurt or suffer any consequences, like I’ll never get caught doing anything bad, like I can save everyone around me from everything. It might be an insane joyride to feel so immortal, flying even, but the ending sensation of cracks, snaps, stinging, and wearing a warm, sticky sweater of blood sure did crack my thick skull as I hit the ground. I learned from all that.

However cliché it may be to say that the lesson learned here is carpe diem or “YOLO,” the truth is simple: life isn’t forever, and high school isn’t forever, and I learned to stop treating those periods as if they were. Waking up in that uncomfortable hospital bed, drugged to the maximum, looking like a hideous Botox-gone-wrong patient, yet still surrounded by tons of worried, sympathetic faces of loved ones, along with various flowers, cards, and balloons, I couldn’t help but feel more like a puffy little princess than the monstrosity I then resembled. It’s a medical miracle that doctors were able to piece back together a face shattered into 23 pieces, much less make the end result even remotely close to my previous appearance. I learned to appreciate the precious jewel of life and the irreplaceable emeralds and rubies surrounding my bed 24/7 until I was released to go home.

The sudden realization I had when waking in that bed was not the “blinding light” that near-death experiences often produce, but a recognition that I should live my life better, treat those I love better, and graciously accept the undeniable truth that I hadn’t been doing these things before. I often ponder: what if I hadn’t woken up? Had I told my family I loved them before I left? Did my friends know how much I cared about them? Was there any doubt that I lived my life for God? Was my time here enough? I never wanted to question it again. I was never afraid of dying, just of not really living, and not living the right way. I truly believe that God has given me another chance in this life; it just took a quick slap in the face (or rather a face plant) to realize how I’m supposed to be living it. Don’t wait to get your life together or to go for your goals; the time is now. My accident was a traumatic eye-opener, although it seemed quite the opposite through the six months of swelling. The genuine truth about “forever” is that it is happening in the present moment. Living life to the fullest is an understatement for me; I want to live life to the point of supersaturation, to the point that I am constantly overflowing with unparalleled joy, kindness, and gratitude. It is vitally important to live each day as if it is my last, to tell people I love them while I have the chance, and to never leave any words unspoken.

The lesson learned? Don’t put off things until tomorrow. What if there is no tomorrow? Live your dreams now. Say it. Do it. Appreciate it. Love it. And whatever you do: Always, always, ALWAYS wear a helmet.

Morgan Sweere Treece is an M.D./Ph.D. student at UAMS.


George Macready and the Art of Family Medicine Publications

By Diane Jarrett

The doctor stood there looking at his patient with a mixture of compassion and horror. Somehow the treatment had gone terribly, terribly wrong. What had started promisingly as innovative therapy for serious trauma wounds had resulted in the ultimate unexpected side effect.

The patient had been transformed into an alligator.

Perhaps this is not surprising when you consider that the doctor had been possessed by the devil only 15 years earlier. Oh wait — that turned out to be just a bad dream.

In a way, all of the above is a dream, from the standpoint that I’m talking about movies and movie doctors as portrayed by George Macready (1899-1973), a stage, film, and TV actor whose career spanned from the 1920s to the 1970s. Along the way, he was a devil-possessed doctor in The Soul of a Monster (1944) and an awesomely inept endocrinologist in The Alligator People (1959). And you thought you had personal problems, or had treated patients with scaly skin.

What has this to do with Family Medicine, you ask?

My department chair said basically the same thing when I told him that I had just obtained my first national publication, not in Family Medicine or JAMA or any of the journals that classic film fans would call the “usual suspects.” No, my debut was in Films of the Golden Age, and my topic was the life and career of George Macready. Macready is best remembered for playing Rita Hayworth’s husband in Gilda (1946), director Stanley Kubrick’s cruel World War I general in Paths of Glory (1957), and the patriarch Martin Peyton in TV’s Peyton Place from 1965 to 1968.

My Macready publication, which in part addressed his roles as physicians in The Soul of a Monster and The Alligator People, did nothing for my career advancement. Publishing essays about the lives of classic movie actors doesn’t count in Family Medicine for promotion, tenure, or anything like that. Zilch. No matter how often I’m published in film history journals (eight times so far) or how much acclaim I receive in film circles, the medical profession is unimpressed.

I’m not a medical doctor. I don’t even play one on TV. My background is in education and journalism, and I came to work in a Family Medicine residency program long after I diagnosed myself as having a serious case of classic film passion. Writing about clinical topics or education related to clinical topics had never been on my radar. It is now, but my heart continues to compel me to spend some of those hours that might be assigned to Annals of Family Medicine on researching the lives of actors such as William Boyd, Richard Todd, and others.

Being a published film historian is not the same as being published in a medical journal, of course. Still, I can’t help but wish that my avocation could be considered as support for my vocation. Thus far it seems unlikely, though I point out George Macready’s medical “connections” at every possibility.

Since promotion is important to me, I’ll continue to faithfully seek opportunities to publish in periodicals that count, kind of like how the group Dr. Hook & the Medicine Show, in their song from the 1970s, sought to be on the cover of Rolling Stone magazine. (That wouldn’t have counted for them either, despite the word “medicine” in the name of their band.) In my personal time, though, I’ll continue to research and write about actors who might have at least played doctors in the movies.

Meanwhile, I can only hope that someone comes up with a peer-reviewed journal entitled Medicine in the Movies.

Diane Jarrett, Ed.D., is the Director of Education and Communications in the Department of Family and Preventive Medicine. She is also the Assistant Director of the residency program. 


The Shoes Have Eyes

By Erin Yancey

“I like your shoes!” I said to the teenage girl standing against the wall of the elevator as I stepped in.  I had just begun my third-year psychiatry rotation, and I was arriving for my first day of clinic with a child and adolescent psychiatrist, Dr. Wilson.  I was particularly interested in both psychiatry and pediatrics, so I had been looking forward to this day for a while.  The girl ignored my compliment and continued to stare down at her bright pink Converse sneakers, complete with multicolored laces and a hand-drawn eye on each shoe.  Her mother standing next to her asked, “What do you say? Can you tell her thank you?”  The girl continued staring down, and her mother smiled a soft apology towards me.  The elevator arrived at our floor, and I tried to smile at the girl one last time to no avail.  Her gaze was fixed upon the sharpie-eyes on her shoes as if in a staring contest.  

Once in Dr. Wilson’s office, I watched him speak with patients with a variety of needs.  Although I had performed psychiatric evaluations with adults on my own, he suggested I shadow him for the first few patients of the day to see how interviewing children differed.  The first several patients were being seen for follow-up for their anxiety, depression, and ADHD.  I noted the close patient-physician relationship—all patients and their families spoke with Dr. Wilson comfortably and honestly, and it was clear they saw him as someone they could deeply trust.  Dr. Wilson quickly briefed me on the next patient to be seen as he had done all morning.  Her name was Sarah, and he explained that she rarely, if ever, spoke during her visits and had a history of severe depression.  She had a difficult past and lived with her adoptive parents.  Earlier in the week she attempted suicide by wrapping a shoe lace around her neck, the subject of her visit today.  The nurse brought Sarah in, and I immediately recognized the bright pink sneakers from the elevator.  She sat down in the chair across from the doctor’s desk and planted her heels firmly on the seat so she could rest her head against her knees. She was around fourteen years old with wild, red hair, and in her right hand she tightly clutched a cell phone and pair of earbuds.  Again, she began staring down at the eyes on her shoes.  Her parents sat on the couch near the back of the room, and Dr. Wilson began the session.


When Dr. Wilson said that this patient would rarely speak, he was not exaggerating.  He spent a few minutes asking her about what had happened, but each question hung in the air unanswered.  Eventually, he directed his questions to her parents.  They seemed concerned, and equally defeated, as they told him that she would not speak with them about it either.  As they spoke, Sarah remained silent, staring at her shoes and methodically winding her earphones around her fingers and palms.  Dr. Wilson expressed his concern about her unwillingness to speak to him and offered to find a psychiatrist that the girl felt more comfortable opening up to.  Her parents assured him her behavior today was not unusual; she had a long history of selective mutism in the presence of medical professionals.  During visits to her primary care physician and even during a recent urgent-care visit for a sprained ankle, she refused to speak to any doctors or nurses.  He sat quietly for a moment, thinking.  His expression suddenly changed as he stood up and said, “Mom, Dad; let’s take a walk.”  Before they left, he said to me, “You’re going to do this.”  I was caught off guard, but I nodded, grateful that he was allowing me to conduct the interview.  I felt nervous, too, because I knew my prospects of making a breakthrough with the girl were dim.  All but me and Sarah left the room, and I walked around the desk to sit in the doctor’s armchair.  

I watched her as she wrapped her headphones around her hands again and again while staring at her shoes.  I attempted to revisit Dr. Wilson’s earlier questions with her.  “Can you tell me what happened this week?” Silence.  “Why did you have to go to the hospital?” … “How have you been feeling lately?  What is your mood like?” More silence.  I began to feel discouraged and acutely aware of how long the others had been gone.  I tried one last time, with a slightly different approach.  “I know you don’t want to talk.  And I know it’s kind of scary being in a doctor’s office.  But actually, I’m not a doctor yet!  I’m still in school, just like you.  If you tell me what happened, it will help us come up with a plan to help you feel better.  Can you tell me?  Did something happen this week?”  As her gaze stayed fixed upon her shoes, she nodded her head.  

The movement was so slight, I almost didn’t notice, but she had nodded her head and finally answered one of my questions.  Suddenly, my hope was renewed that we may be able to communicate after all.  Careful to only ask yes or no questions, I asked about her family, her home, and her school.  She nodded and shook her head appropriately, and all the while furiously wound her earphones around her fingers, around her hands, around her knees.  I then asked about her friends.  She froze.  With the cessation of her movements, I noticed the faint horizontal scars on her wrists.  I was surprised that I had not noticed them sooner, and then wondered if the systematic winding of her headphones was not absent-minded fidgeting, but perhaps a very intentional distraction.  I delved a little deeper and eventually learned that her best friend, her only friend, had moved to another state this week.  Her eyes, still fixed on her Converse, began to well up with tears.  One escaped and traced an uneven river down her face.  She did not move to brush it away.  For a moment, she and I both stared at the eyes on her shoes in silence.  Her multicolored laces were covered in stars, and I briefly wondered if those were the laces she had turned to in a moment of despair.  My stomach turned, and I felt tears spring into the back of my own eyes as I imagined how she must have felt.  In that moment I realized that I will never know all the details of her past, or the depths her depression brings her to.  I can try to understand, but I never truly will.  We carried on, and although her tears would occasionally be too many to be contained by the brim of her eyes, no sound ever escaped from her.  Not a sob, not a sniffle, nothing.  It was as though she was purposefully refusing to make a sound. 

Our communication rested on a delicate balance of safety and trust, and a knock on the office door disrupted the scale and signified that our interview had come to an end.  I spoke with Dr. Wilson in the hallway about the information I had gleaned from our near one-sided conversation.  We reentered the room, and he and Sarah’s parents spent the rest of the appointment discussing his treatment recommendations.  As they talked, Sarah and I sat next to each other near the doctor’s desk.  The appointment ended, and I said goodbye to her and watched her unique pink sneakers pace silently out of the room.  Dr. Wilson shared his optimism at the small breakthrough we had seemed to make.  I, however, felt disheartened as the young girl left, knowing that her illness was severe and her struggle with depression would likely be a lifelong battle.  As if sensing my deflation, he said with a smile and a shrug, “Progress is progress.” 

As my psychiatry rotation moved forward, I interviewed a diverse cohort of patients with a variety of psychiatric issues, including depression, panic disorder, PTSD, and schizophrenia, among others. It wasn’t long until I realized that no specialty fascinated me more than psychiatry and its patient population, and I decided what my path in medicine would be. Often, I reflected on my interview with Sarah, the girl with the eyes on her pink sneakers. I interviewed plenty of patients who were somewhat difficult to communicate with, whether they were reluctant to discuss certain aspects of their history or were psychotic and required extra patience to complete a psychiatric evaluation, but she remained the only patient I ever interviewed who refused to speak at all. Yet even without words she had taught me an incredibly important lesson for my future career as a psychiatrist: progress is progress.

I also still had a lot of questions about Sarah. I wondered what her personality would be like if things were going better. I wondered whether her motivation to remain silent came from a place of fear or apathy. The more I contemplated, the more I remembered the vibrancy of her shoes. They almost didn’t match their wearer. But expression is not something that necessarily requires being verbal. Perhaps this girl, who was in a very dark place mentally, found it easier to express her personality through her choice of footwear. She never made eye contact with me throughout our interview, but maybe the eyes drawn on her shoes conveyed that even though her mouth was tightly shut, her eyes were wide open. Or maybe she spent so much time staring at the shoes’ eyes because they made her feel seen. This is all, of course, speculation, but expression is variable, and oftentimes truly understanding our patients requires paying attention to even the smallest of details.

Toward the end of my rotation, I spent a week at the State Hospital, where the most ill and indigent psychiatric patients in the state received inpatient care. Some patients at this facility had criminal charges against them. On my first day there, I went to the courtyard to meet and interview one such patient. As I approached him, I tried to think of what I could say to make a connection with him, to start things off on the right foot. He was sitting in a chair alone, wearing a black hoodie, sweatpants, and a pair of vibrant blue tennis shoes. He looked up at me, and as he did, I smiled warmly and said, “I like your shoes!”

Fictional names were used to preserve confidentiality.

Erin Yancey is a fourth-year medical student in the UAMS College of Medicine.


The Revolution in Neuroscience

by Edgar Garcia-Rill

The Aim of Science

Numerous journals published each month contain hundreds of articles addressing diseases, clinical care, and novel therapies. This alone is a persuasive argument in favor of brain research. Not only do neuroscientists feel overwhelmed by the proliferation of their own literature, but the sheer number of published “breakthroughs” also adds to the inordinate weight of the competition. We should remember that most theories are eventually proven wrong, and that is “business as usual” in science. Considerable patience is needed to ensure that “breakthroughs” are properly replicated, validated, and accepted.

Science, after all, is the search for better answers, not absolute truth. The aim of science is to achieve better and better explanations. Sir Karl Popper proposed that a hypothesis can only be empirically tested, never proven absolutely true, and that it can be advanced only after it has been tested [1]. It is perfectly acceptable for a scientist to be wrong, as long as he or she is honestly wrong; that is, as long as he or she designed and performed the experiment honestly.
              
Popper also advanced the concept of falsifiability. The honest scientist should apply this concept to his or her own theories and research findings. He or she should be the theory’s best critic, probing its weaknesses so that, by surviving withering criticism from the one scientist most familiar with the experiment, the hypothesis can come closer to the truth. However, few scientists actually throw down the gauntlet on their own research. Many defend their work with desperation and viciously criticize opposing theories. Some even censor the work of opponents by rejecting their manuscripts or grant applications. Logic would demand that a scientist strive to prove his or her own work false before someone else does, but that feat is difficult to accomplish during the typical 3- to 5-year period of a grant award. In other words, the funding granted for an idea requires supporting evidence and success so that the grant may be renewed for another similar span of time. Few “breakthroughs” can be proven correct (or incorrect) in such a short period; thus, the argument goes, more studies and a further funding period are often needed. Review committees face the task of keeping applicants from overselling their work. Generally, reviewers agree on the quality of an application; they tend to shred weak applications, although in some instances unworthy grants get funded anyway. Conversely, due to the shortage of funds, many worthy projects go unfunded.

Sometimes a novel technique has excellent “wow” value and yet can hide weaknesses. These flaws may take time to be exposed, especially when reviewers who have jumped on the bandwagon defend it out of self-interest. Some “exciting” methods can be adopted wholesale by an entire field without due consideration for proper controls. On occasion, the individual is so well respected that mere reputation can hide minor weaknesses. There is also the “halo” effect of being at a top-20 medical school, an effect that can provide enough of a nudge to get an award funded, although it may be no better than one dredged from the backwaters of science. The question is this: will any of those awards lead to a major breakthrough, a new cure, or a novel effective therapy? The answer is that we do not know. We do know that only a very few will provide a significant advance, but if we do not fund the research, we relegate our lives to the status quo, with no options for the future.

So how can we determine which science to fund?  How can we be certain which discovery is closer to the truth?  How can we identify the finding that will lead to a new cure?  We can design ways to do all these things better, but never with absolute certainty.  A good starting point is the realization that we can be “snowed,” at least for a while.

Famous Neuroscientific Theories

The “Blank Slate” theory, proposed by thinkers from Aristotle to St. Thomas Aquinas to Locke, suggested that everything we know comes from experience and education, nothing from instinct or natural predisposition. Many of its proponents can be forgiven for advancing a “nurture or bust” philosophy, since genetics was not in their lexicon. That is, they had incomplete knowledge. An avalanche of data has since shown that many traits and instincts are inherited; the “nature” argument, we now know, is not exclusive of nurture.

At the beginning of the 19th century, “Phrenology” proposed that specific traits could be localized to distinct regions of the skull overlying the brain, creating detailed cortical maps of these functions. Its advocates exceeded the available data and in many cases used the practice for ulterior motives, including racism, to spread their influence. By the 20th century, such pinpoint assignations of skull regions had been discredited.

Another fallacy is that people “use only 10% of their brains,” an assumption deriving from a misunderstanding of studies of sensory-evoked responses, in which “primary” afferent input (e.g., vision, touch, hearing) activates only a small percentage of the cortex. This result sidesteps the fact that most of the cortex is devoted to association functions that process such information both serially and in parallel. Embedded in this conclusion is the fact that neurons need to fire; otherwise, their influence on their targets is lessened. Without reinforcement, synapses weaken, almost as if the input were “forgotten.” “Use it or lose it” is the principle of brain activity. What this means is that our brain is continuously active, all of it.

Contrary to what many researchers espouse, many drugs shown to be efficacious in animals have limited effectiveness in humans [2]. In fact, for most drugs tested in animals, the probability that they will be effective in humans is roughly that of a coin toss (~50%). Unfortunately, the opposite can also be true. Thalidomide, a drug tested in more than 10 species, hardly ever produced birth defects, except in humans [3].

Many of these theories were not disproven because of scientific fraud or faulty experiments. Most were the result of incomplete knowledge, which includes the common problems of study size, limited technology, and so on. We maintain that adequate application of falsifiability by the proponents might have prevented some of these spectacular failures. The incomplete knowledge and inadequate application of falsifiability point back to neuroscientists themselves, who bear responsibility for failures that could have been avoided had falsifiability been practiced.

Mixed in with such famous failures are a number of sophisticated and stunning discoveries about the brain. At the turn of the 20th century, Ramon y Cajal observed that the nervous system consists of individual cells, not a continuous network as was thought at the time. Cajal’s work led, in turn, to the description of the synapse and of chemical transmission across the narrow clefts between neurons. That description led to the identification of a myriad of transmitters, some of which could alter behavior, followed by the development of psychoactive drugs that modulate mood, movement, and other functions. Pharmacological intervention soon allowed many patients to live outside an institution, eliminating the need for padded rooms and “lunatic asylums.”

About 30 years ago, it was thought that humans were born with all the brain cells we would ever have. It now seems that, although we lose cells throughout puberty, neurogenesis occurs in the adult. The creation of new brain cells, a totally foreign concept until recently, is now accepted wisdom. How to control such generation is a topic of study in a number of neurodegenerative disorders.

In science, certain simple conclusions can have an unintended impact. The conclusion that Benjamin Libet reached in his 1980s experiments on the Readiness Potential is one example. Because the “will” to perform a movement appeared to occur before the actual movement, and the person was not aware of this intention, Libet concluded that we perform movements through “unconscious” processes. Unfortunately, this conclusion led to another, more disturbing one: that our subconscious is responsible for our voluntary actions. By extension, this meant that there was no free will. The implications for personal responsibility carried unwanted effects, including legal arguments absolving miscreants of culpability. However, in Libet’s work the person studied was fully conscious, not unconscious. Moreover, while awake, we are aware of our environment as we navigate it, even though we do not expressly attend to any particular event. In other words, we are aware of cars and pedestrians, often moving to avoid collisions, even as we carry on a conversation. In fact, we are “pre-consciously” aware of the world around us and respond appropriately, even without attending to a particular event. This interpretation makes it clear that we are indeed responsible for our actions, for our voluntary movements. It is also clear, however, that the perception of that environment, whether pre-conscious or conscious, is altered in mental disease. Disorders like psychosis can dramatically alter these perceptions and thus guide actions for which the person cannot be held responsible. That is why proper diagnosis of mental disease is essential.

It is inarguable that brain research has led to remarkable improvements in health and quality of life.  The rather modest investment in funding targeting the brain has paid off exponentially.  While the National Institutes of Health are funded to the tune of ~$40 billion yearly for research from cancer to heart to brain, spending for defense research is more than 10 times greater.  While scientific review committees discuss, dissect, and agonize over a $1 million grant application for almost one hour, Congress makes billion-dollar defense funding decisions in minutes.  We should realize that the successes in brain research will far outweigh the failures, but we should also know that only some of those successes will result in a novel treatment.  

In addition, the annual recurring costs of most brain diseases, in terms of medical expenses, lost income, and care, run into the billions of dollars. A single novel treatment derived from a typical $5-10 million research program can save billions of dollars every year. We know that for every dollar spent on research we stand to save thousands every year, and conversely, for every dollar we do not spend on research, we stand to pay thousands every year from now on.

Famous Techniques

One of the most appealing techniques in medicine is magnetic resonance imaging (MRI), which uses strong magnetic fields, radiofrequency pulses, and field gradients to produce signals that are then computed into images of the brain. Functional MRI (fMRI) uses blood oxygenation levels to compute images that are assumed to reflect neural activity. The standard black-and-white displays allow the clinician to detect and measure tumors, infarcts, and even infection, as well as bone, fat, and blood. This technique has been a life-saver in a number of disorders in which clear, detailed, and accurate anatomical images are required. With the advent of more sophisticated MRI computation and fMRI, the displays have become color coded, so that changes in blood oxygenation are displayed in beautiful false-color images. This characteristic also allows proponents of the technique to oversell their product.

Today, fMRI is being used in research to draw unwarranted conclusions about the workings of the brain in real time. Researchers undertake studies ranging from voluntary movement to sensory perception to the performance of complex tasks. Some labs have “concluded” that they can distinguish truth-telling from lying and have pitched fMRI as a lie detector. The technical issues will not be repeated here, except to emphasize that the field has been remiss in standardizing how images are generated. The result is a field in which the same experiment carried out by different labs produces different images and, therefore, different conclusions. The method is complex, requiring recurring individual decisions about the weighting of factors, decisions that are applied differently by different researchers at multiple stages of image processing. It is incumbent on researchers in the field to develop a standardized methodology. Perhaps the most serious problem is that the technique actually measures blood flow, not neural activity. The pretty images represent the aftermath of brain processes that included both excitation and inhibition. The fMRI is essentially a static image of an ongoing complex event, much like taking a picture of an orchestra and, from the frozen positions of the players, drawing conclusions about the identity of the piece being played.

Granted, this illustration may be an exaggeration, but the fact remains that the mesmerizing effect of the images hides the fact that they are based on moving processes and founded on assumptions about how the brain works. Moreover, overselling of the technology has captured an undeserved portion of the funding pie. Many have naively moved to the technology without developing testable hypotheses and controllable experiments. The monies of the BRAIN Initiative, a near-monopoly of funding for the method, have been hijacked to the detriment of other valuable technologies. It is hoped that the limitations of the method will be exposed so that those using more esoteric variables can better justify their decisions. The value of the technology to the clinical enterprise is currently without question, but when complex neural processes are studied using what is a measure of blood flow, the conclusions drawn can indeed be questioned.

One policy issue that emerges is the following: what is the harm in throwing money at the problem? Why not overfund a research area until all the problems are worked out? The answer is that these practices are unrealistic. A similar situation arose when agencies began pouring money into AIDS research. Funding levels, which historically had supported between 5% and 15% of submitted AIDS grant applications, rose to allow funding of 20% to 25% of applicants. The pressure to make breakthroughs increased, and “discoveries” came hard and fast, with seemingly rapid progress toward systematically resolving the problems of a complicated infectious process. Responsible labs were soon confronted with the realization that many of these “discoveries” were improperly controlled. These labs began spending resources and time validating questionable results and unsupported theories, and some were forced to attempt to replicate many such findings simply to move the field ahead, if at all. This was a consequence of overfunding from which the field suffered.

Another technique with which the public at large, including attorneys, is enamored is genetics. This powerful array of methods has exceptional promise. As a future clinical tool, personalized medicine stands to provide answers to a host of medical questions and may even give us some cures. But there is the issue of genes and determinism. Genes are not deterministic but very malleable, likely to produce different proteins under slight changes in conditions. In addition, genes are co-dependent, such that the expression of some genes depends not only on nearby genes (in terms of chromosome location) but also on some distant genes.

The field of genetics promises to address the links between genes, the brain, behavior, and neurological and psychiatric diseases. Neurogenetics therefore holds great promise for the future of clinical science, but it has also created a gap. This promise has attracted the bulk of funding for genetic studies, pushing the testing of treatments and cures (that is, translational research) further into the future, and leaving a gap between patients who need to be treated now and those who may be successfully treated with a genetic intervention in 20 or 30 years.

Research grants have migrated away from clinical studies toward molecular studies. Because of the complexity of the genome, short-term answers are unlikely. Premature genetic interventions could be catastrophic, yet the power of the technology has moved funding away from interventions in the clinic. Translational neuroscience is designed to bring basic science findings promptly to the clinic [4]. It is a response to an Institute of Medicine report from 2003 calling for more emphasis in this area [5]. The reason for the concern voiced in the report was the gradual decrease in research grants awarded to MDs (presumably doing research on patients) compared with PhDs (presumably doing research on animals). Over a 10-year period, the percentage of MDs with awards had decreased from 20% to 4%. While some attention has been paid to increasing translational research funding, the fact is that most grant reviewers are basic scientists who are not very familiar with clinical testing and human subject research. Animal studies are more easily controlled than human subject studies, so there is an inherent difference that makes for lower funding scores for human studies. It is incumbent on the research community to correct the discrepancy, because we stand to lose public trust. We now live in a world of immediate gratification, and cures far off in the future will not be warmly received.

Is the emphasis on genetics and molecular biology truly warranted? Definitely, but not at the expense of advances that could improve the quality of life of patients now rather than later. Some self-scrutiny is called for from the molecular biology community. For example, researchers should realistically identify some of the limits of their own technology. One area that needs such scrutiny is the knock-out mouse, a genetically modified mouse in which a gene is kept from being expressed or is deleted from the genome. Knocking out the activity of a gene provides knowledge about the function of that gene, making for a marvelous model for the study of disease. The process is complex, certainly cutting-edge, and very effective if properly employed. Because of the variety of genes, the technology has created an opportunity for many labs to develop their own knock-out mouse, leading to a myriad of new genetically modified mouse lines that researchers can make, buy, study, and manipulate.

The scientists who developed the technology won the Nobel Prize in Physiology or Medicine in 2007. Knock-out technology has to date been most successful in identifying genes related to cancer biology. These genetically altered animals allow the study of genes in vivo, along with their responses to drugs. The problem, however, has been the inability to generate animals that faithfully recapitulate the disease in man. This factor is understated, to the detriment of all. In addition to the glaring fact that ~15% of knock-outs are lethal and some fail to produce observable changes, there is also the overlooked fact that knocking out a gene will up-regulate many other genes and down-regulate another large group of genes [6]. In nature, single-gene mutations that survive are very rare, so the knock-out is not simply a study of such mutations; it is an attempt to learn all that a single gene does. The problem is that, without knowing which OTHER genes are up- or down-regulated, the knock-out animal represents an uncontrolled experiment on a creature that never would have existed in nature. It is incumbent on representatives of the field to discuss these factors and adequately control their studies.

Some of these problems can be overcome by using conditional mutations, in which an agent added to the diet can induce a gene to be expressed, or to cease being expressed, temporarily. The problem is that this approach does not control the up- or down-regulation of linked genes whose identities are unknown. Moreover, none of these methods measures compensation. Very few researchers verify how the absence of the gene creates compensatory changes in the expression of other genes. For example, knocking out the gap junction protein connexin 36 creates a mouse without connexin 36, but the manipulation leads to overexpression of other connexins [7]. The field of knock-out mouse lines is expanding, increasingly uncontrolled, and funded well beyond its current scientific justification.

A final issue is that, as far as the brain is concerned, protein transcription is a long-term process, while the workings of the brain are in the millisecond range. Over the last several minutes, during the reading of this article, transcription was irrelevant: none of the perception, attention, or comprehension of the information on these pages required gene transcription. Of course, the long-term storage of the information into memory requires gene transcription, but not before. Our brain takes about 200 milliseconds to consciously perceive a stimulus. Gene action is on the order of minutes to hours, which is not on the same time scale. Gene transcription is irrelevant during a conversation with friends about the latest news. Assessing thought and movement in real time is too fast for genetic methods, but not for two technologies, the electroencephalogram (EEG) and the magnetoencephalogram (MEG).

The EEG amplifies electrical signals from the underlying cortex (only from the surface of the brain, not from deep structures), but these signals are distorted by the skull and scalp. The MEG measures the magnetic fields of these electrical signals, but it requires shielded recording rooms, massive computational power, and superconductors that function in liquid helium. Helium-free MEGs are now in development, which would make the technology less expensive to operate. The MEG is also very useful in producing exquisite localization of epileptic tissue, especially the initial ictal (seizure activity) event. As such, it is reimbursable for diagnostic and surgical uses. As the only real-time, localizable measure of brain activity, the MEG is likely to make inroads into the rapid events in the brain. Recent reports suggest that the MEG may also provide detailed images of any part of the body, including functioning muscle.

The Revolution

To gain perspective, we have to understand the battle within the brain sciences that has led to the current state. Subsequent to Sir Isaac Newton’s deterministic theories of how the world worked, there arose the idea that the brain worked the same way: all brain function could be reduced to the smallest physical components, the ultimate in micro-determinism. This approach was manifested in the brain sciences as “behaviorism,” the idea that all actions and thoughts were due to the physicochemical nature of the brain. A major proponent was B. F. Skinner, who considered free will an illusion and held that everything you did depended on previous actions. Advances in molecular biology and the structure of DNA fanned the fervor for this view. There was no room for the consideration of consciousness or subjective states. This was the world of the reductive, micro-deterministic view of the person and the world. These views influenced education and policy, suggesting that the issue was not to free man but to improve the way he is controlled: the “behaviorist,” one-way reductionist view of the world in general and of the brain in particular.

The implication of “behaviorism” for thought and action was that consciousness was an epiphenomenon of brain activity and that the reductionist approach, if only enough details were known, provided a complete explanation of the material world. This deterministic view of the world began to crumble under the weight of the discoveries of quantum mechanics. The old deterministic idea of behavior and the absence of free will were undermined by those advances. Behaviorism was thus replaced by a “cognitive revolution” that espoused mental states as dynamic emergent properties of brain activity. That is, a two-way street exists between consciousness and the brain, with consciousness fused to the brain activity of which it is an emergent property. This does not imply a dualism of two independent realms; rather, mental states are fused with the brain processes that generate them. This approach eliminated the duality of “brain” versus “soul” or “mind.” Just as evolution undermined the tenets of creationism, the cognitive revolution dissipated the suspicion of a separate “soul.”

The “cognitive revolution” implied a causal control of brain states downward as well as upward determinism. This two-way approach offers a solution to the free-will-versus-determinism paradox. The cognitive approach retains both free will and determinism, integrated in a manner that provides moral responsibility [8]. In the current mainstream scientific view, volition remains causally determined but is no longer subject to strict physicochemical laws, and there is no longer a mind-versus-brain paradox but a singular, functionally interactive process. Instead of placing the “mind” within physicochemical processes, thought becomes an emergent property of brain processes.

To use a simplistic parallel, the brain is to thought and action as the orchestra is to music. Thought and action are emergent properties of the brain just as music is an emergent property of the orchestra. Music cannot exist without the orchestra, just as thought and action cannot exist without the brain. It is a single, reciprocal relationship, one in which brain states influence thought and action (downward), and the external world modulates the activity of the brain (upward). The “mind,” or consciousness, can be viewed as the downward control of a system changing under continuously impinging external inputs.

This new viewpoint combines bottom-up determinism with top-down mental causation, the best of both worlds. The world of reductionism is not entirely rejected, merely considered not to contain all the answers, and an entirely new outlook on nature becomes manifest. The revolution in neuroscience has provided new values, whereby the world is driven not just by mindless physical forces but also by human mental values.

References

[1] Karl R. Popper. 1983. “Realism and the Aim of Science.” In Postscript to the Logic of Scientific Discovery, W.W. Bartley III, ed. New York: Routledge.

[2] R. Heywood. 1990. “Clinical Toxicity – Could it have been predicted? Post-marketing experience.” In Animal Toxicity Studies: Their Relevance for Man, C.E. Lumley and S. Walker, eds. Lancaster: Quay.

[3] Niall Shanks, Ray Greek, and Jean Greek. 2009. “Are animal models predictive of humans?” Philos. Ethics Humanit. Med. 4: 2.

[4] E. Garcia-Rill. 2012. Translational Neuroscience: A Guide to a Successful Program. New York: Wiley-Blackwell.

[5] L.T. Kohn, ed. 2004. Academic Health Centers: Leading Change in the 21st Century. Committee on the Role of Academic Health Centers in the 21st Century. Washington, DC: National Academies Press.

[6] D.A. Iacobas, E. Scemes, and D.C. Spray. 2004. “Gene expression alterations in connexin null mice extend beyond gap junctions.” Neurochem. Int. 45: 243-250.

[7] D.C. Spray and D.A. Iacobas. 2007. “Organizational principles of the connexin-related brain transcriptome.” J. Memb. Biol. 218: 39-47.

[8] Roger Sperry. 1976. “Changing concepts of consciousness and free will.” Perspect. Biol. Med. 20: 9-19.

Edgar Garcia-Rill, Ph.D., is the Director for the Center for Translational Neuroscience and a professor in the Department of Neurobiology and Developmental Sciences at UAMS.


Love in the Time of Cholera


Gabriel Garcia Marquez

“An Appreciation of Love, Aging, and Cholera”

By Richard Ault

The Man ….

Gabriel Garcia Marquez, perhaps the most honored and well-known Latin American novelist of the modern age, was born in Aracataca, Colombia, in 1927. These origins identify him in Colombia as a “Costeño,” a native of the Caribbean coastal region of the country known for its color, vibrancy, and the rhythm of its music and language, in contrast to the dreary, wet, mountainous interior where the capital city of Bogota is located. His Costeño origins would loom large in his life, his journalism, and his fiction. To Americans, the most accessible entry to Costeño culture is Cartagena, a coastal city frequented by American cruise lines.

Marquez lived a childhood filled with considerable instability. His father was an itinerant pharmacist/homeopathic quack, and while his family remained partially intact despite the instability, he was frequently raised by his maternal grandparents in Aracataca and moved occasionally to Barranquilla, the other primary Costeño city. These locations are significant because most critics identify the unnamed city at the center of our story as an amalgam of Cartagena and Barranquilla. In my opinion, the most important influence on young Gabriel was his maternal grandfather, Colonel Nicolás Márquez, a man greatly respected in the Costeño region for his refusal to remain silent about political atrocities during Colombia’s seemingly endless civil war, in particular the massacre of perhaps as many as 3,000 banana workers by thugs employed by the infamous United Fruit Company. These atrocities occurred the year after Gabriel’s birth. Indeed, his first novel, Leaf Storm, is a searing reimagination of these events.

After his secondary school graduation in 1947, Gabriel’s higher education took an uncertain path. He spent two years studying law at the National University in Bogota, largely to please his father. He transferred to the University of Cartagena in 1950 after the National University closed during a period of particularly intense political violence, which is, after fútbol, Colombia’s national pastime. His dedication to his legal studies was essentially non-existent, and there is no record that he ever attended a class in Cartagena. He gravitated to journalism at this time, and it is the profession that supported him, more or less, until the publication of One Hundred Years of Solitude led to fame and some fortune.

Unsurprisingly, Gabriel Garcia Marquez was an outstanding journalist, pioneering what was, in effect, investigative journalism, probably years before the term was even coined in America. Still, his role as a journalist in Colombia was a precarious way to make a living, given the instability of many of the country’s newspapers and magazines, to say nothing of having to navigate the dizzying array of loyalties and betrayals in the seemingly never-ending “La Violencia” that gripped Colombia for much of the 20th century. Ultimately “La Violencia” morphed into the narco-terrorism of Escobar, Ochoa, and Lehder. Luckily for Gabriel, by the time the narcos came to prominence he had made his fortune and spent very little time in Colombia. Otherwise, it is likely that Pablo Escobar would have had him assassinated for dangling a participle.

Before we move to more literary concerns, we must speak a little about Gabriel’s politics.  It will come as no surprise to anyone that he was a leftist and an undying admirer of Cuban dictator, Fidel Castro.  Of course, we must place the leftist politics of almost all Latin American intellectuals into proper context.  The sad reality is that without a democracy-promoting middle class, given the unholy alliance among the great landowners, the Church, the Military, and whatever western capitalists were relevant in a country (United Fruit, Anaconda Copper, Royal Dutch Shell, etc.) on the right, and the Socialists and Communists on the left, the choice was easy.  And it was made easier still by the power and bumptiousness of the Great Gringolandia to the north. When Castro came along and thumbed his nose at the Yanquis, it was true love for Gabriel, a love that never wavered even after Castro revealed himself to be the murderous thug that he was.  Gabriel’s later political journalism was marked by pirouettes and tight-rope walking worthy of a Wallenda to justify “La Revolución.”

And His Literature …

Throughout much of his professional life Gabriel (or Gabo to his friends) lived an almost hand-to-mouth existence, supported, such as it was, by his journalism.  Although he had been publishing fiction regularly since Leaf Storm in 1955 (it took him seven years to find a publisher), he saw very little material success until the publication of One Hundred Years of Solitude in 1967.  This magnificent book ultimately sold over 30 million copies, led directly to the award of the Nobel Prize for Literature, and secured his financial future.

Of course, you can’t get far into a discussion of Gabo without coming to terms with the concept of magical realism.  There are about as many definitions of this term as there are critics writing about it, though they all revolve around magical, fantastical, or supernatural themes and events woven into realistic or even mundane circumstances.  As another essayist noted, it is literature that suspends the physical laws of time, space, life, and death in otherwise realistic circumstances.  For reasons that I am not capable of understanding, magical realism has bloomed spectacularly in Latin American literature, and Gabo is only one of many practitioners.  The best I can do to understand magical realism is to propose two possible contributing factors: first, the undeniable music of the Spanish language; second, the rampant despair and poverty in so much of Latin America, which might promote a desire to escape a crushing reality.

Though critics may not consider Gabo the foremost magical realist, the broader world certainly does, largely on the strength of One Hundred Years of Solitude.  This soaring multi-generational tale of the Buendia family and the Costeño town of Macondo is a breathtaking book.  Yet this is, after all, a contribution to a medical literary journal, and I think I can better justify “El Amor en Los Tiempos del Cólera” as at least tangentially appropriate to this journal’s themes.

I must admit that my absolute love for this book may be a bit unhinged.  In my reading life, very few books have affected me as much as Cholera, and I have pondered why this is.  I was particularly concerned when my wife, who is not an unsophisticated reader, was profoundly unimpressed with it.  I have thus decided that perhaps only men can be true romantics, and in my opinion the absolute best audience for this book is seasoned men who have experienced love and loss.  So, yes, possibly I’m a bit biased.  Perhaps it’s enough to say that I am a sucker for great characters, and I have never met a more compelling set in my five-plus decades as a reader.  Florentino Ariza, Fermina Daza, and Dr. Juvenal Urbino came vividly alive to me, and even the lesser characters – Aunt Escolastica, Lorenzo Daza, Transito Ariza, Jeremiah de Saint-Amour, and that unnamed, malevolent parrot – provide a delicious richness that makes this book, for me, a transformational joy.

While in One Hundred Years the magic is knitted deeply into the characters and the narrative at literally every point in the novel, Cholera, like several other Marquez books, wears its magic lightly.  The book does have its moments: the preternaturally linguistic parrot; Fermina’s dreadful growing doll; the whistling scrotal hernia; the 622 serious assignations in 51 years, 9 months, and 4 days (certainly magical for most of us); and my personal favorite, Florentino’s utter inability to compose a simple business letter, despite his immense talent for writing love missives for himself and strangers alike in the “Arcade of the Scribes.”  The magic in Cholera is an accompaniment to a bittersweet tale of love, obsession, and aging.

And Now to “Cholera”

While to most it is the Feast of Pentecost, for us it turns out to be, in fact, “El Día de los Muertos,” as we meet Dr. Urbino attending to the pre-arranged suicide of his friend, Jeremiah de Saint-Amour, who has died, according to Dr. Urbino, of “Gerontophobia.”  And before this busy day ends, the machinations of a malign parrot lead to the death of Dr. Urbino himself.  In a few pages we see death at both ends of a bizarre spectrum – one a tightly planned, prepared, and intellectually justified exercise in self-destruction, and the other due to capricious and thoughtless chance, though clearly leavened by a dose of arrogance.  At the end of these opening pages only Dr. Urbino’s wife, the regal Fermina Daza, remains.  And she is just as memorable at 72: “Her clear almond eyes and her inborn haughtiness were all that were left to her from her wedding portrait, but what she had been deprived of by age she more than made up for in character and diligence.”  She is, we quickly see, a woman of substance who could only stir strong feelings in men.  But at this point little do we know how deep those feelings run in the heart of Florentino Ariza, our soon-to-be-met protagonist.

A few pages further on, we discover that Fermina Daza and Florentino Ariza were the victims of an erstwhile love torn apart by cruelty, class envy, and, perhaps in that time, inevitability.  This drama launched them on two separate, but intertwined, paths.

Fermina Daza, of course, becomes the wife of Dr. Urbino, the catch of the century in our Costeño city.  He is a classically trained physician, a man of culture and refinement, and, at least outwardly, a gentleman of the highest character.  Above all, he is a man of the future.  Throughout his distinguished medical career he advocates for modern medical practice, and in the seemingly unending battle against cholera he is at least the savior of the city, if not the country.  The marriage of Dr. Juvenal Urbino and Fermina Daza, though outwardly perfect in every way, is not without its challenges.  Both are strong-willed, which inevitably leads to small tensions and strains, though in the case of Dr. Urbino his stubbornness is multiplied by more than a whiff of childishness.  He also succumbs to a dalliance with Barbara Lynch, Doctor of Theology, and this arrangement almost, but not quite, destroys his marriage.  Despite these travails, Dr. Urbino and Fermina lead a successful and fulfilling life, and his last words to her are, I think, fitting: “You will never know how much I loved you.”  That is a great exit line!

If one were worried that Fermina Daza’s widowhood would be empty and forlorn, leading to inevitable decline, one would not have taken the measure of Florentino Ariza, whom we shall now meet in depth.  Just as Dr. Juvenal Urbino is a man of the future, Florentino Ariza is indisputably a man of the past.  He is the bastard son of Don Pius V Loayza and the remarkable Transito Ariza.  Don Pius V discreetly provides for his son, but Florentino Ariza also benefits from his mother, who owns a small notions shop and makes a very good living as a shylock to distinguished families who have fallen on hard times and cannot bear to borrow within their social circle.  Don Pius V Loayza further ensures a position for Florentino Ariza at The River Company of the Caribbean, where he is mentored professionally by his uncle, Don Leo XII Loayza, and carnally by Lotario (interesting name choice) Thugut, that rarest of all things, a randy German.

Despite having the material advantages to adopt the accoutrements of his times, Florentino remains sartorially a creature of the past: formless, home-tailored black suits, high celluloid collars, hats of no style.  And above all, he wears a perpetually mournful visage.  A notable highlight of the otherwise mediocre film adaptation of the book is Javier Bardem’s spot-on presentation of Florentino Ariza’s doleful passage through life.

After Florentino Ariza and Fermina Daza’s chaste youthful love affair is dashed by Fermina’s social-climbing and thoroughly unpleasant father, Lorenzo Daza, Florentino vows perpetual fidelity to his lost love, and goes on to violate it in the most spectacular way, though always keeping Fermina in his heart, if not in other organs.  This is where the book becomes confounding.  Less sophisticated readers will deem Cholera a love story, but this is unsatisfying, indicating a most superficial reading that is, frankly, troubling to me.  The novelist, I think, plays with us a bit here and provides Florentino with clearly romantic traits.  He is both a poet and a lover of poetry; he serenades his beloved (of the moment) with his violin; and that sweet, sad face – what woman wouldn’t fall for it!  Indeed, as Florentino begins to stray from his vows to Fermina, we at first view his conquests as almost comical.  It’s as if comedy legend Stan Laurel were a secret Casanova.  But after his assignations with the Widow Nazaret, Ausencia Santander, and the woman from the insane asylum, events take a dark turn.  We discover that Florentino Ariza, under the tutelage of Lotario Thugut, supplements his consequential assignations by “hunting little birds.”  Now, to me, the term “hunting” is jarring, evocative of 20th-century psycho-sexual killers more than of our slightly absurd hero, and this more or less ruins the notion of a love story for me.

As we begin to doubt Florentino, events go from sinister to deadly.  First, a thoughtless branding of Olimpia Zuleta’s belly leads to her gruesome murder at the hands of her enraged husband.  Then, as Florentino Ariza’s 51 years, 9 months, and 4 days of combined vigil and bacchanal approach their end, things get downright creepy as America Vicuna enters our story.  She is a fourteen-year-old blood relative of Florentino Ariza, though the exact degree of consanguinity is never established.  Inevitably, she enters Florentino Ariza’s bed, making him a serious competitor to Lolita’s Humbert Humbert as the worst legal guardian ever!

It is in the midst of his dalliance with America Vicuna that Florentino Ariza learns that Dr. Juvenal Urbino has met the parrot with his name on it, and that Fermina Daza is now free.  Without a thought for America Vicuna, and with cosmically awful timing, he bursts into the funeral to declare his undying and eternal love to the newly minted widow.  She, in turn, with natural imperiousness, rejects him in no uncertain terms.  But, of course, our story does not end here.  As she settles into her widowhood, Fermina begins to ponder the fact that she has lost her identity as anything other than Mrs. Juvenal Urbino, and that she now possesses perfect freedom to live the rest of her life as she pleases.  Once again our story turns to romance, even if it is dimmed by the ravages of age.  Step by step, Fermina Daza and Florentino Ariza are on a path leading inevitably to the luxurious cabin on the riverboat “New Fidelity.”  “New Fidelity!”  Really!

As our story comes to its floating climax, there is still much to ponder.  Shortly into the voyage, Florentino receives the fateful news that America Vicuna is dead by her own hand, and he inwardly vows that all he can do is stay alive and “not allow himself the anguish of that memory.”  Is this an act of great self-control, or does it reveal him as nothing more than a monster?  The river voyage provides us with a close-up look at love accommodating the realities of old age.  The last pages are replete with images and events that shape the future for the aging Florentino and Fermina, including the hostility of her children, her loss of hearing, and her lament, “I smell like an old woman!”  And the couple’s long-delayed passion is consummated: “… he dared to explore her withered neck with his fingertips, her bosom armored in metal stays, her hips with their decaying bones, the thighs with their aging veins.”  Finally, there is their inexpert love-making, perhaps also qualifying as magical realism given the practice Florentino Ariza has had.

In the end, then, we are left only with cholera.  Its imagery pervades the book.  Dr. Juvenal Urbino has been a tireless combatant against the disease, promoting modern treatment and, more importantly, modern sanitation.  His effort partially defeats the disease, particularly among the urban upper classes.  The poor, however, continue to suffer as they remain living in squalor.  The young Florentino Ariza, after his dismissal by Fermina Daza, suffers from acute lovesickness, the symptoms of which mimic those of cholera.  However, in perhaps the most magical moment in the book, the dread disease provides our lovers with an unlikely path to their future.  After the “New Fidelity” encounters a massive epidemic upriver and raises the yellow plague flag to speed its return to the city, Florentino orders the Captain to turn around and speed back toward the plague zone.  Incensed, the Captain asks, “And how long do you think we can keep up this goddam coming and going?”

Florentino had kept his answer ready for 53 years, 7 months, and 11 days and nights:

“Forever,” he said.


Richard Ault, MHSA, is an Assistant Professor in the Department of Health Policy and Management of the Fay W. Boozman College of Public Health.
