University of Arkansas for Medical Sciences
Medicine and Meaning

Essaouira

Image of a woman walking on a street in a Middle Eastern country

Jonathan Spradley

This image is a reminder of the intersectionality of healthcare disparities between geographic location and gender. Women, such as this one in Essaouira, Morocco, are tasked with difficult journeys of trying to participate in healthcare systems made inaccessible to them as doors are repeatedly closed in their paths.


Jonathan Spradley is a medical student at UAMS.

Filed Under: all, Images

Brilliante

Artistic image of musical notes drawn with light on piano keys

Jonathan Spradley

This image represents the “art of medicine” – the artistic and creative expression that can come from technical skills or seemingly black & white components. As with learning to play the proper keys on a piano to create music, the overall outcome of medical education should be to reach beyond scientific knowledge and create personalized, meaningful treatment plans and relationships with patients.


Jonathan Spradley is a medical student at UAMS.

Filed Under: all, Images

Orchard

By Elizabeth Hanson

Drawing of a heart growing from a tree branch

First, she told him that winter had come.
That the ground was hard
And the soil cold.
That the seeds were stowed safely away
Somewhere deep, waiting
For the warm season to come.

Then, she told him the snow had melted.
That the orchard was a lake
And the earth too wet.
That they must wait for it to dry
Because seeds, well, they couldn’t swim!
They needed dirt, soft yet strong
Where roots might anchor,
Tangle and grow.

Next, she told him that a daffodil had bloomed.
That it stretched tall,
And yellow beneath the orchard’s branches.
That above, buds had formed,
Blanketing the bark
The way his quilt covered him.
If only he could hang on.

Last, she told him that his turn had come.
That he would sleep and then wake
With the heart that had grown
From the seeds they had saved.
That he might dream of the day
She had walked among the orchard rows
In the place where love grew on trees,
And hearts could be plucked
Like apples in the fall.


Elizabeth Hanson is a resident physician in the Emergency Medical Department at UAMS.

Filed Under: all, Images

A room with a view

By Erick Messias

Nobody would call him doctor anymore. It turned out to be harder to get used to Mister Maia than it had been to get used to Doctor Maia, which had happened after he moved to America for medical residency some forty years earlier. It gave him pause to hear Mister Maia, as if it were some long-gone relative from the old country. Yet he was now Mr. Maia, except to a couple of Mexican nurses who called him Señor Maia, trying to please him but only succeeding in irritating him a bit further. He did not correct them; it had been many years since the days when he explained that Brazilians spoke Portuguese, not Spanish, and that their capital was Brasilia, not Buenos Aires.

Elderly man reflected in a window

The nursing home’s brochure had exaggerated, as marketing pieces usually do. In reality, damp and faded mini-apartments substituted for the airy and bright little cottages in the pamphlet. The staff seemed to make an honest effort to be polite and helpful, he recognized. His children also seemed sincere in believing the brochure and the apartment manager, a dignified and patient African American woman in her fifties.

The kids had helped their old man move to the nursing home after their mother, his wife of those same forty years, had died. He was now single again, and his children, one a computer engineer living somewhere in California and the other a staff writer at some New York web-magazine, pleaded with him to leave the large house in the suburbs of Little Rock for the nursing home in the sprawling West Little Rock enclave, aptly named Mount Elysium.

He offered little resistance. He knew the alternatives were few, staying in the house impractical, and the invitations to move to California or New York sincere illusions. He knew there was no return path to the old country, where his childhood had once been and where his own siblings were still living. After all, Brazil was now the old country, where he knew few people and even fewer knew him. The old Little Rock house where they had raised their children, where he and his wife had had so many brief breakfasts and long dinners, was too old, too big, and too full of used-up furniture and memories; much like him. The children did it all with American efficiency, and in little more than a week everything was gone; the house was on the market, and he moved to the Mount Elysium nursing home.

His daughter had insisted he get a room with a view. The manager had promised to find her father the best views in the whole complex. The daughter believed the manager as she had believed the brochure: wholeheartedly. His son had agreed and made sure everything was set up for him at the bank and with the real estate company. Mr. Maia drove both kids to the airport since he could still drive around town in the semi-new Toyota Camry. Returning from the airport, he drove back to the old house instead. Old habits, old pathways and turns taken for forty years had left deep impressions in his brain. He saw the “For Sale” sign, took a deep breath, and drove himself to the nursing home. They were not the kind of people to make a fuss about things.

He asked the guard at the entrance how to get to his new home, his benign incarceration in a room with a view, he told himself as he parked. He was sure it would be a view of some mountain ridge since they were far from the Arkansas River.

He had to admit the little apartment was not bad. It was clean, sparse and quiet. As a young man with literary aspirations, he would have called it spartan. He was finally alone after the back and forth of the last days with his kids in town. He set the clock in the automatic coffee maker for seven in the morning and went to bed at ten, falling asleep over a science fiction book. He fell into an exhausted slumber and woke up to the smell of fresh coffee.

The window was closed so he decided to finally check the view he had been promised. By his own estimation, it would be a front view to Pinnacle Mountain, one of the locally famous Arkansas landmarks. He opened the window and was surprised to see some thick tree foliage blocking whatever view would be in place.

He was working himself up to complain to the manager when he noticed something unusual about the tree. It was not one of the ubiquitous oak or pine trees seen in most Little Rock yards. He looked and looked and, after examining it carefully, finally recognized it was a Jambo tree, just like the one they had in front of their house on the Brazilian northeast coast so long ago. He looked at the elliptical, large, shiny leaves, at the curvy, bright red fruits; he remembered how difficult it was to keep the neighborhood kids from throwing rocks to get those fruits down. How many times his father had promised to cut down the tree to avoid the shower of stones when the tree was in season. He had not seen a Jambo tree in many years, and now he could go to his window and look at it. He looked further, past the foliage, and saw the tall gray wall that surrounded the front neighbor’s house, the wall that showed those neighbors did not think they belonged in the low middle-class street, the wall reminding them there were other, more important people in town. The neighbor was a district judge, or so he was told, because he never saw him or his children, only a car coming in and out of the walled house. He looked to his right and saw the other house across the street, the one that belonged to the old lady who had married a Spaniard and whose father had given his name to the street on which they lived. She was always trying to be friendly and make conversation, especially after her Spanish husband had left her for a younger Brazilian woman. Her house had a large backyard filled with goiaba and siriguela trees, and when they were in season, the kids, including him, would invade the orchard and have a tropical feast. He looked to the sides of the window and noticed the same chipped, cheap paint he and his father had coated many times. The same little chips of paint he would pull at when nervously talking to the neighbor’s daughter, whom he thought was his first love.
He felt his nose fill up with the tropical smells of his childhood.

He noticed some drops of rain and had to close the window. A cup of warm coffee started his day, and he went to his new routine of news on the internet, driving around by himself to find a place for a light lunch, and coming back for a movie at the dollar theater or on the computer. Each morning, his coffee machine did not have to wake him up as he was itching to enjoy his view again.

He opened the window expecting the Jambo tree and was greeted by the great sandy plains of the South Atlantic. It was the green beach of his childhood, the one that had given his state the nickname of Land of Green Seas. The air smelled of the salty breeze, and he felt tiny grains of sand hitting his face. He saw the long fingers of the breakers holding the tides to protect the city; he saw in the distance a few jangadas, the small rafts used by local fishermen to bring home their oceanic harvest. Along the shoreline, there was a runners’ lane where people would do their morning walking and running, and along with it the many sellers of local arts and crafts. He took a deep breath and shut his window, sitting down in front of the computer for his morning coffee and news.

The next morning, he walked slowly to the window. He put his hand on the small latch and stayed there longer than expected. Instead of opening the window, he moved his fingers between the louvers and peeked outside. A small house loomed on the other side of a cobblestone street. The sidewalk in front of it was a couple of feet higher than the street level. He remembered his grandfather telling him that the high sidewalk had been built to keep the floodwaters from getting into the house. That house across the street was a bar, and he remembered peeking through the window at night from his grandparents’ house in Jaguarana, the city where his mother would take him and his sister on vacation every year. His grandmother did some peeking too: to see who was going to the bar, how long they were staying, and who was taking them home. He had his own reasons to look, since the daughter of the bar’s owner, a girl with long black hair, would come help her father from time to time. That was their own version of beer commercials pairing women and alcohol; and to him the best.

On a cloudy Little Rock morning, he opened the window to the dark corners of a motel room. What he saw lying in bed, getting undressed nervously, was a young, blond, and pale eighteen-year-old girl he recognized immediately as his wife of forty years. He saw the quiet pride in the young body; he saw the white skin punctured by birthmarks, and he saw her long hair running all the way to the small of her back. She would never have it as long as that again. At first, they would do just that, lie in bed in awe of each other’s body, youth, and inexperience, afraid that if they had sex suddenly everything would change in some unpredictable way. Eventually they did and it did. They did not know at the time they would be spending the rest of their lives together and witnessing the inclement effects of time on their firm muscles and smooth foreheads. He did not know at the time she would only get more beautiful to him, after each child, after each year.

And so, his final routine was born. It would not last long but it lasted enough. One day the nursing home cleaning crew found him dead, sitting in the recliner by the window. His children came again from the coasts to Arkansas. His son talked to the nursing home manager, and she told him about a placid pattern and a peaceful death. She told him he had a smile on his face when they found him looking out the window. His son imagined she told that to all grieving children, and he chose to believe it.

The son remembered his sister insisting on the room with a view, and for some reason he could not discern, he remembered one of the many sayings from his father, “The best part of doing a good job is to be able to look at it when you are finished.”

His son looked at the view outside and saw Pinnacle Mountain looming in the background. He was reminded of how beautiful the Natural State was, and then he shut the window.


Erick Messias, M.D., Ph.D., M.P.H., is a professor of psychiatry and the Associate Dean for Faculty Affairs in the UAMS College of Medicine.

Filed Under: all, Fiction

Puffy Girl Problems

By Morgan Sweere Treece

Lights, bright lights, blurry lights, headlights, flashing ambulance lights, EMT flashlights, fluorescent hospital lights.  Those are probably some of the only things I can actually recall about that night.

November 15, 2012.  It was a chilly Thursday evening, the time of year when all the leaves had just fallen on the straw-like, browned grass and everyone was getting ready to pull out their fuzzy, woolen scarves.  It seemed no different from any other day.  I got up, went to school, and everything was normal, but when I got home, I heard some of my favorite words leave my mom’s mouth:  “Your cousins from Fayetteville are in town and headed out to the field.”

My cousins were on break from college and home for the week, which only happened once in a blue moon, so it was breaking news to me.  What did this rare occurrence mean?  Extreme night mudding!!!  In mere seconds, I hopped in my car and sped on my way to the field where we took the four-wheelers each time the cousins visited town.  It could be compared more to an obstacle course than a field, the way the trees seemed to randomly spurt from the rocky soil, with large boulders and hills lining the moist, unmowed fields.  We unloaded the four-wheelers, still caked with the old, dried mud and dust from our last night-riding event.  Soon enough, the four of us were zipping over the rough and ragged barriers toward our designated finish line, the dry, brown grass grazing our ankles and bitter wind gnawing at our cheeks.

One of the next memories I have of that momentous evening is waking up, confused, hardly able to see, completely unable to think, having been rolled over by multiple unidentified hands.  In what seemed like hours but was actually seconds after that, I tried to understand what was happening.  I recognized no one around me.  My mind was unable to comprehend the situation.  I heard shouting – in the back of my mind, a perceived distant memory.  “HELP! She’s gushing blood! Morgan!”

EMTs, paramedics, cousins, and my mom all screeched in unnerving anxiety.  They attempted to relieve the aching pain from my marred face and body soaked with blood.  I couldn’t feel the pain, but I knew it was there somehow.  I couldn’t talk.  I couldn’t breathe.  I couldn’t think.  I had a major migraine, and I could feel the blood gushing from every pore of my body.  The blood started to dry and left an amber stain all over due to the icy, winter-like wind.  I had no idea what was happening, and no one took the time to explain it to me.  I could see the wrathful, crimson liquid encompass my line of vision in mere seconds, and I even tasted the iron soaking onto my tongue.  Then all went black.

The next 14 hours of reconstructive surgery and stabilization were marked by panic, worry, and God’s miraculous and loving work in the hands of many talented ER physicians and surgeons.  The surgery was tedious but flawlessly executed in the end, soothing news to my anxiety-filled family in the waiting room.  Then I woke up in a haze.  The intense, sterile hospital room filled with dazzling yet blinding fluorescent lights and vaguely familiar faces left me still unable to comprehend just where I was, or even who I was.  The copious amounts of drugs in my body overtook my mind, and the five or so hours after I awoke are mostly a fuzzy cloud in my memory.

Several days later, I regained complete alertness in a sober state.  Someone told me that my first spoken words after I roused from my morphine-induced slumber of surgery solicited the immediate presence of my mother, my boyfriend, and (of course) a Sonic Coke — clearly, my priorities were pretty straight, even for a pain-filled, drugged-up girl.  I knew exactly what I wanted and what I would have missed greatly had I never seen those three things again, however trivial the last is.

When I woke up in the hospital, the first things I actually remember were large, dark, blurry mounds blocking my view.  I blamed this “tunnel vision” on my cheeks, which had risen to an insanely abnormal height due to the swelling of my entire face.  However, now that I think about it, this block in my sight may symbolically have been my seeing less temporarily so that I could see more now.  It took a full six months — what I called “puffy girl problems” — for all the inflammation to disappear, but after I finished being the metal chipmunk and began to regain the feeling in my face, I realized how important the experience was to me, however awful it may have been.  And trust me, it was really awful.

The best and worst experiences in people’s lives are always the most important memories because those experiences have the greatest impact on the way people live.  I consider my ordeal, in which I utterly shattered every possible bone in my face into 23 pieces, including both jaws, one of the most painful and grueling experiences I have had at this point in life.  I had a concussion and practically became The Terminator, with four metal plates, four metal screws, and large wires and bands that starved me for six weeks.  However, I also realized it was one of the greatest blessings in my life because it showed me the most important things in life: God, family, and friendships.

Even almost a year later, I often get caught up in conversations filled with questions like “You broke your…face?” “Did it hurt?” “Can I touch it?” “You have metal in there?” “What happens when you go through airport security?  Or if you get slapped?”, and the most frequently asked, “Do you look different?” or “Do you like it?”  Those last words of blunt inquiry always seem to strike me the most, although they are only a frivolous query to others.  They seem to stop me dead in my tracks more than anything anyone could probably ever say to me.  I’m not different.  I’m still me.  I still feel like me.  But I’m not.  I didn’t recognize myself in the mirror for almost three months after my surgery because, in truth, I am different—on the outside and the inside.  My (still groundbreaking) stupidity and recklessness not only taught me not to attempt ramping a rocky slope at top speed on a four-wheeler in the autumn dusk just because I was dared to; it taught me that I—as much as I want to be, or think I feel most of the time—am not Superman.  I shouldn’t live like it either.  I can’t live like I’m invincible, like I can’t get hurt or suffer any consequences, like I’ll never get caught doing anything bad, like I can save everyone around me from everything.  It might be an insane joyride in the air to feel so immortal, flying even, but the ending sensation of cracks, snaps, stinging, and wearing a warm, sticky sweater of blood sure did crack my thick skull as I hit the ground.  I learned from all that.

However cliché it may be, the lesson learned here is carpe diem, or “YOLO”: life isn’t forever and high school isn’t forever, and I learned to no longer treat those periods as if they were.  Waking up in that uncomfortable hospital bed, drugged to the maximum, looking like a hideous Botox-gone-wrong patient, but still surrounded by tons of worried, sympathetic faces of loved ones, along with various flowers, cards, and balloons — I couldn’t help but feel more like a puffy little princess than the monstrosity I was then portraying.  It’s a medical miracle that doctors were able to piece back together a face shattered into 23 pieces, much less to make the end result even remotely close to my previous appearance.  I learned to appreciate the precious jewel of life and the irreplaceable emeralds and rubies surrounding my bed 24/7 until I was released to go home.  The sudden realization I had when waking in that bed was not the normal “blinding light” that near-death experiences often produce, but more a recognition that I should live my life better, treat those I love better, and graciously accept the undeniable truth that I hadn’t been doing these things before.  I often ponder: what if I hadn’t woken up?  Had I told my family I loved them before I left?  Did my friends know how much I cared about them?  Was there any doubt that I lived my life for God?  Was my time here enough?  I never wanted to question it again.  I was never afraid of dying — just of not really living, and not living the right way.  I truly believe that God has given me another chance in this life; it just took a quick slap in the face (or rather a face plant) to realize how I’m supposed to be living my life.  Don’t wait to get your life together or to go for your goals — the time is now.  My accident was a traumatic eye-opener, although it seemed quite the opposite through the six months of swelling.
The genuine truth about “forever” is that it is happening in the present moment.  Living life to the fullest is an understatement for me; I want to live life to the point of super saturation, to the point that I am constantly overflowing with unparalleled joy, kindness, and gratitude.  It is vitally important to live each day as if it is my last, to tell people I love them while I have the chance, and to never leave any words unspoken.  The lesson learned?  Don’t put off things until tomorrow.  What if there is no tomorrow?  Live your dreams now.  Say it.  Do it.  Appreciate it.  Love it.   And whatever you do:  Always, always, ALWAYS wear a helmet. 


Morgan Sweere Treece is an M.D./Ph.D. student at UAMS.

Filed Under: all, Non-fiction

George Macready and the Art of Family Medicine Publications

By Diane Jarrett

The doctor stood there looking at his patient with a mixture of compassion and horror. Somehow the treatment had gone terribly, terribly wrong. What had started promisingly as innovative therapy for serious trauma wounds had resulted in the ultimate unexpected side effect.

The patient had been transformed into an alligator.

Perhaps this is not surprising when you consider that the doctor had been possessed by the devil only 15 years earlier. Oh wait — that turned out to be just a bad dream.

In a way, all of the above is a dream, from the standpoint that I’m talking about movies and movie doctors as portrayed by George Macready (1899-1973), a stage, film, and TV actor whose career spanned from the 1920s to the 1970s. Along the way, he was a devil-possessed doctor in The Soul of a Monster (1944) and an awesomely inept endocrinologist in The Alligator People (1959). And you thought you had personal problems, or had treated patients with scaly skin.

What has this to do with Family Medicine, you ask?

My department chair said basically the same thing when I told him that I had just obtained my first national publication, not in Family Medicine or JAMA or any of the journals that classic film fans would call the “usual suspects.” No, my debut was in Films of the Golden Age, and my topic was the life and career of George Macready. Macready is best remembered for playing Rita Hayworth’s husband in Gilda (1946), director Stanley Kubrick’s cruel World War I general in Paths of Glory (1957), and the patriarch Martin Peyton in TV’s Peyton Place from 1965 to 1968.

My Macready publication, which in part addressed his roles as physicians in The Soul of a Monster and The Alligator People, did nothing for my career advancement. Publishing essays about the lives of classic movie actors doesn’t count in Family Medicine for promotion, tenure, or anything like that. Zilch. No matter how often I’m published in film history journals (eight times so far) or how much acclaim I receive in film circles, the medical profession is unimpressed.

I’m not a medical doctor. I don’t even play one on TV. My background is in education and journalism, and I came to work in a Family Medicine residency program long after I diagnosed myself as having a serious case of classic film passion. Writing about clinical topics or education related to clinical topics had never been on my radar. It is now, but my heart continues to compel me to spend some of those hours that might be assigned to Annals of Family Medicine on researching the lives of actors such as William Boyd, Richard Todd, and others.

Being a published film historian is not the same as being published in a medical journal, of course. Still, I can’t help but wish that my avocation could be considered as support for my vocation. Thus far it seems unlikely, though I point out George Macready’s medical “connections” at every possibility.

Since promotion is important to me, I’ll continue to faithfully seek opportunities to publish in periodicals that count, kind of like the group called Dr. Hook & the Medicine Show, in their song from the 1970s, sought to be on the cover of Rolling Stone magazine. (That wouldn’t have counted for them either, despite having the word “medicine” in the name of their band.) In my personal time, though, I’ll continue to research and write about actors who might have at least played doctors in the movies.

Meanwhile, I can only hope that someone comes up with a peer-reviewed journal entitled Medicine in the Movies.


Diane Jarrett, Ed.D., is the Director of Education and Communications in the Department of Family and Preventive Medicine. She is also the Assistant Director of the residency program. 

Filed Under: all, Non-fiction

Conversations @ UAMS: Dr. Matt Quick

By Jace C. Bradshaw and M. Paige Plumley

Dr. Quick is an Associate Professor of Pathology with a clinical interest in gynecologic and obstetric pathology. He completed his residency at UAMS in anatomic and clinical pathology. He followed his residency training with a fellowship in surgical pathology at UAMS and a women’s and perinatal fellowship at Harvard Medical School/Brigham and Women’s Hospital.

Tell me about yourself.

Dr. Matt Quick

Well, okay. I think we think about ourselves in a bunch of different ways, so I always try to remind myself of how I self-identify. The first thing I self-identify as is a parent and a husband. I always make sure that is my priority. Then, I am a pathologist and a teacher. I think it is really easy to get swallowed whole into medicine. I think about that and try to make sure that I am a very attentive parent and husband because that’s been the most important thing in my life. I like to have fun, but my job is pretty serious because I deal with cancer a lot. So everywhere else, I like to turn the seriousness off, which can be annoying to my wife from time to time, and I understand that. And she gets it. I like being outdoors; I love playing video games and watching TV. I like to fish, but I don’t get to do it much anymore because that requires stillness and quiet: two things that don’t exist in my life. But that’s okay. I feel like I am a very lucky person. I found my way into this job—it was a total accident. I like that I get to help people even if they are unaware that I am helping them. And I enjoy my life outside of work as well.

As a pathologist, which type of cell or tissue would you be, and why?

Well tissue-wise, you could be like the liver and deal with toxic things and make them un-toxic. So, I guess I would be a hepatocyte because you would be good at dealing with problems and issues and detoxifying situations. I feel like that’s a big part of my job, not on the clinical side, but on the medical student side with all the advising I get to do. One of the pleasures that I have is talking to people who bring their problems in, and I can help find a way to take care of those problems.

How did you get here? How did you decide you wanted to be a doctor?

I don’t know. My parents will tell you that I said I wanted to be a doctor early on. I guess I always had this fascination with the body and how it worked. I also spent a fair amount of time in emergency departments and doctors’ offices, so it must have imprinted on me. Then in college, I got really interested in how we get sick. I thought microbiology was awesome. I thought it was so fascinating that a single-celled organism can take down a human. In college, that was the closest I could get to studying disease, so I was a microbiology major who wanted to do infectious disease as I was applying to medical school. Then [in medical school], I got a C in internal medicine, which was my first C. That was a moment in my life. But what I realized was that I got that grade because I wasn’t that fired up about the subject. And someone told me, “If you want to do infectious disease, that’s a branch of internal medicine, so you should probably be good at it. And you’re not.” So they asked me if I was more interested in the lab side of medicine – more like pathology. And I said no. But I started spending more time in the microbiology lab, which led to me spending time in the surgical pathology lab, which led to me falling in love with surgical pathology, which is now what I do for a living. I don’t do any microbiology anymore. I still think it is cool, but I like it more as a friend.

Now how did I end up as a gynecological pathologist? That’s another story of failure. I wanted a heme-path fellowship and did not get it. It’s a recurring theme in my life that I think I want to do something, and the universe is like “You don’t want that.” I went to my mentor, and he said, “That’s okay. You are better at other stuff like gynecological pathology anyways.” He urged me to apply for a fellowship, and something about the field just clicked. There are so many different tumors and so much variability. Everything works together. I just found it so interesting, and it makes the job interesting. I look back, and I am happy that all of those things happened. It helps me when I am advising students because I had no clue, so I can help the students who also have no clue.

What are your strengths and weaknesses as a physician?

Let’s start with strengths, so I can think about weaknesses in the background. One of the strengths that a good pathologist exhibits is a very sharp eye for detail. You have to have a very inquisitive mind. I can notice small changes easily, and it drives my wife nuts. It’s the little things that set my brain off. Then wanting to know why or how helps. When you look at 50-60 cases a day, it’s easy to fall into the trap of just doing rapid-fire work without thinking. If you do that, you will miss things and miss diagnoses. So, a good pathologist has to always maintain a high level of vigilance and attention to detail. Those are my strengths due to my interest in surgical pathology. I know that applies to most specialties. I do it with glass slides; they do it with people.

My wife is going to be like, “You don’t ever talk this frankly with me.” The weakness I have is to not take things too seriously as a defense mechanism. I often think that if I don’t take things too seriously, then they aren’t too serious. Yet, what I deal with daily is incredibly serious. So those two states are constantly in conflict with each other. It makes it difficult to look at a case of a 25-year-old that you are going to diagnose cancer in and go home and feel okay at the end of the day. So, I have to have this defense mechanism, but I feel like it detaches me from my patients. And that’s tough. This way, I can go home and be a normal parent and husband, and I avoid being depressed, reserved, and quiet. So, the inability to directly deal with strife is one of my biggest weaknesses. It’s something I struggle with every day. I feel like this is one reason why a lot of doctors have problems outside of work.

What type of cookie would you be, and why?

How do you answer a question like that? I would be a chocolate chip cookie without nuts because that’s the kind that my kids like the most. When I think about cookies, I think about them being happy. It would be nice to make them happy in cookie form. If I had nuts or pecans or something like that, they would hate me, and I’m hypoallergenic this way.

Filed Under: all, Conversations at UAMS

The Shoes Have Eyes

By Erin Yancey

“I like your shoes!” I said to the teenage girl standing against the wall of the elevator as I stepped in.  I had just begun my third-year psychiatry rotation, and I was arriving for my first day of clinic with a child and adolescent psychiatrist, Dr. Wilson.  I was particularly interested in both psychiatry and pediatrics, so I had been looking forward to this day for a while.  The girl ignored my compliment and continued to stare down at her bright pink Converse sneakers, complete with multicolored laces and a hand-drawn eye on each shoe.  Her mother standing next to her asked, “What do you say? Can you tell her thank you?”  The girl continued staring down, and her mother smiled a soft apology towards me.  The elevator arrived at our floor, and I tried to smile at the girl one last time to no avail.  Her gaze was fixed upon the Sharpie eyes on her shoes as if in a staring contest.  

Once in Dr. Wilson’s office, I watched him speak with patients with a variety of needs.  Although I had performed psychiatric evaluations with adults on my own, he suggested I shadow him for the first few patients of the day to see how interviewing children differed.  The first several patients were being seen for follow-up for their anxiety, depression, and ADHD.  I noted the close patient-physician relationship—all patients and their families spoke with Dr. Wilson comfortably and honestly, and it was clear they saw him as someone they could deeply trust.  Dr. Wilson quickly briefed me on the next patient to be seen as he had done all morning.  Her name was Sarah, and he explained that she rarely, if ever, spoke during her visits and had a history of severe depression.  She had a difficult past and lived with her adoptive parents.  Earlier in the week she had attempted suicide by wrapping a shoelace around her neck, the subject of her visit today.  The nurse brought Sarah in, and I immediately recognized the bright pink sneakers from the elevator.  She sat down in the chair across from the doctor’s desk and planted her heels firmly on the seat so she could rest her head against her knees. She was around fourteen years old with wild, red hair, and in her right hand she tightly clutched a cell phone and pair of earbuds.  Again, she began staring down at the eyes on her shoes.  Her parents sat on the couch near the back of the room, and Dr. Wilson began the session.

Depressed teen girl with red hair looks down

When Dr. Wilson said that this patient would rarely speak, he was not exaggerating.  He spent a few minutes asking her about what had happened, but each question hung in the air unanswered.  Eventually, he directed his questions to her parents.  They seemed concerned, and equally defeated, as they told him that she would not speak with them about it either.  As they spoke, Sarah remained silent, staring at her shoes and methodically winding her earphones around her fingers and palms.  Dr. Wilson expressed his concern about her unwillingness to speak to him and offered to find a psychiatrist the girl might feel more comfortable opening up to.  Her parents assured him her behavior today was not unusual; she had a long history of selective mutism in the presence of medical professionals.  During visits to her primary care physician and even during a recent urgent-care visit for a sprained ankle, she refused to speak to any doctors or nurses.  He sat quietly for a moment, thinking.  His expression suddenly changed as he stood up and said, “Mom, Dad; let’s take a walk.”  Before they left, he said to me, “You’re going to do this.”  I was caught off guard, but I nodded, grateful that he was allowing me to conduct the interview.  I felt nervous, too, because I knew my prospects of making a breakthrough with the girl were dim.  Everyone but Sarah and me left the room, and I walked around the desk to sit in the doctor’s armchair.  

I watched her as she wrapped her headphones around her hands again and again while staring at her shoes.  I attempted to revisit Dr. Wilson’s earlier questions with her.  “Can you tell me what happened this week?” Silence.  “Why did you have to go to the hospital?” … “How have you been feeling lately?  What is your mood like?” More silence.  I began to feel discouraged and acutely aware of how long the others had been gone.  I tried one last time, with a slightly different approach.  “I know you don’t want to talk.  And I know it’s kind of scary being in a doctor’s office.  But actually, I’m not a doctor yet!  I’m still in school, just like you.  If you tell me what happened, it will help us come up with a plan to help you feel better.  Can you tell me?  Did something happen this week?”  As her gaze stayed fixed upon her shoes, she nodded her head.  

The movement was so slight, I almost didn’t notice, but she had nodded her head and finally answered one of my questions.  Suddenly, my hope was renewed that we might be able to communicate after all.  Careful to only ask yes or no questions, I asked about her family, her home, and her school.  She nodded and shook her head appropriately, and all the while furiously wound her earphones around her fingers, around her hands, around her knees.  I then asked about her friends.  She froze.  With the cessation of her movements, I noticed the faint horizontal scars on her wrists.  I was surprised that I had not noticed them sooner, and then wondered if the systematic winding of her headphones was not absent-minded fidgeting, but perhaps a very intentional distraction.  I delved a little deeper and eventually learned that her best friend, her only friend, had moved to another state this week.  Her eyes, still fixed on her Converse, began to well up with tears.  One escaped and traced an uneven river down her face.  She did not move to brush it away.  For a moment, she and I both stared at the eyes on her shoes in silence.  Her multicolored laces were covered in stars, and I briefly wondered if those were the laces she had turned to in a moment of despair.  My stomach turned, and I felt tears spring into the back of my own eyes as I imagined how she must have felt.  In that moment I realized that I would never know all the details of her past, or the depths her depression brings her to.  I can try to understand, but I never truly will.  We carried on, and although her tears would occasionally be too many to be contained by the brim of her eyes, no sound ever escaped from her.  Not a sob, not a sniffle, nothing.  It was as though she was purposefully refusing to make a sound. 

Our communication rested on a delicate balance of safety and trust, and a knock on the office door disrupted the scale and signified that our interview had come to an end.  I spoke with Dr. Wilson in the hallway about the information I had gleaned from our near one-sided conversation.  We reentered the room, and he and Sarah’s parents spent the rest of the appointment discussing his treatment recommendations.  As they talked, Sarah and I sat next to each other near the doctor’s desk.  The appointment ended, and I said goodbye to her and watched her unique pink sneakers pace silently out of the room.  Dr. Wilson shared his optimism at the small breakthrough we had seemed to make.  I, however, felt disheartened as the young girl left, knowing that her illness was severe and her struggle with depression would likely be a lifelong battle.  As if sensing my deflation, he said with a smile and a shrug, “Progress is progress.” 

As my psychiatry rotation moved forward, I interviewed a diverse cohort of patients with a variety of psychiatric issues including depression, panic disorder, PTSD, and schizophrenia, among others.  It wasn’t long until I realized that no specialty fascinated me more than psychiatry and its patient population, and I decided what my path in medicine would be.  Often, I reflected on my interview with Sarah, the girl with the eyes on her pink sneakers.  I interviewed plenty of patients who were somewhat difficult to communicate with, whether they were reluctant to discuss certain aspects of their history, or they were psychotic and required extra patience to complete a psychiatric evaluation, but she remained the only patient I ever interviewed who refused to speak at all.  But even without words she had taught me an incredibly important lesson for my future career as a psychiatrist: progress is progress.  

I also still had a lot of questions about Sarah.  I wondered what her personality would be like if things were going better.  I wondered whether her motivation to remain silent came from a place of fear or apathy.  The more I contemplated, the more I remembered the vibrancy of her shoes.  They almost didn’t match their wearer.  But expression is not something that necessarily requires being verbal.  Perhaps this girl who was in a very dark place mentally found it easier to express her personality through her choice of footwear.  She never made eye contact with me throughout our interview, but maybe the eyes drawn on her shoes conveyed that even though her mouth was tightly shut, her eyes were wide open.  Or maybe she spent so much time staring at the shoes’ eyes because they made her feel seen.  This is all of course speculation, but expression is variable, and oftentimes truly understanding our patients requires paying attention to even the smallest of details. 

Towards the end of my rotation, I spent a week at the State Hospital, where the most ill and indigent psychiatric patients in the state received inpatient care.  Some patients at this facility had criminal charges against them.  On my first day there, I went to the courtyard to meet and interview one such patient.  As I approached him, I tried to think of what I could say to make a connection with him, to start things off on the right foot.  He was sitting in a chair alone, wearing a black hoodie, sweatpants, and a pair of vibrant blue tennis shoes.  He looked up at me, and as he did, I smiled warmly and said, “I like your shoes!” 


Fictional names were used to preserve confidentiality.


Erin Yancey is a fourth-year medical student in the UAMS College of Medicine.

Filed Under: all, Non-fiction

The Revolution in Neuroscience

by Edgar Garcia-Rill

The Aim of Science

Numerous journals published each month contain hundreds of articles addressing diseases, clinical care, and novel therapies.  This output is a persuasive argument in favor of brain research.  Yet neuroscientists feel overwhelmed by the proliferation of their own literature, and the sheer number of published “breakthroughs” adds to the inordinate weight of the competition.  We should remember that most theories are eventually proven wrong, and that this is “business as usual” in science.  Considerable patience is needed to ensure that “breakthroughs” are properly replicated, validated, and accepted.  

Science, after all, is the search for better answers, not absolute truth.  The aim of science is to achieve better and better explanations.  Sir Karl Popper proposed that a hypothesis can only be empirically tested but never proven absolutely true, and that it can only be advanced after it has been tested 1.  It is perfectly acceptable for a scientist to be wrong, as long as he is honestly wrong — that is, as long as he or she designed and performed the experiment honestly.  
              
Popper also advanced the concept of falsifiability.  The honest scientist should apply this concept to his or her own theories and research findings.  He or she should be their own best critic, probing for weaknesses so that, by surviving withering criticism from the one scientist with the greatest familiarity with the experiment, the hypothesis can come closer to the truth.  However, few scientists actually throw down the gauntlet in this way.  Many defend their work with desperation and viciously criticize opposing theories.  Some even censor the work of opponents by rejecting their manuscripts or grant applications.  Logic would demand that a scientist strive to prove his or her own work false before someone else does, but that feat is difficult to accomplish during the typical 3- to 5-year period of a grant award.  In other words, the funding granted for an idea requires supporting evidence and success so that the grant may be renewed for another similar span of time.  Few “breakthroughs” can be proven correct (or incorrect) in such a short period; thus, the argument goes, more studies and a further funding period are often needed.  Review committees face the task of keeping applicants from overselling their work.  Generally, reviewers agree on the quality of an application; they tend to shred weak applications, although in some instances unworthy grants get funded anyway.  Conversely, due to the shortage of funds, many worthy projects go unfunded.

Sometimes, a novel technique has excellent “wow” value and yet can hide weaknesses.  These flaws may take time to be exposed, especially when reviewers jumping on the bandwagon defend the technique out of self-interest.  Some “exciting” methods can be adopted wholesale by an entire field without due consideration for proper controls.  On occasion, an individual is so well respected that mere reputation can hide minor weaknesses.  There is also the “halo” effect of being in a top-20 medical school, an effect that can provide enough of a nudge to get an award funded even though it may be no better than one dredged from the backwaters of science.  The question is this: will any of those awards lead to a major breakthrough, a new cure, or a novel effective therapy?  The answer is that we do not know.  We do know that only a very few will provide a significant advance, but if we do not fund the research, we relegate our lives to the status quo, with no options for the future.

So how can we determine which science to fund?  How can we be certain which discovery is closer to the truth?  How can we identify the finding that will lead to a new cure?  We can design ways to do all these things better, but never with absolute certainty.  A good starting point is the realization that we can be “snowed,” at least for a while.

Famous Neuroscientific Theories

The “Blank Slate” theory proposed by thinkers from Aristotle to St. Thomas Aquinas to Locke suggested that everything we know comes from experience and education, nothing from instinct or natural predisposition.  Many of its proponents can be forgiven for advancing a “nurture or bust” philosophy since genetics was not in their lexicon.  That is, they had incomplete knowledge.  An avalanche of data has since shown that many traits and instincts are inherited; the “nature” argument, we now know, is not exclusive of nurture.

At the beginning of the 19th century, “Phrenology” proposed that specific traits could be localized to distinct regions of the skull overlying the brain, creating detailed cortical maps of these functions.  Its advocates went beyond the available data and, in many cases, used the practice for ulterior motives, including racism, to spread their influence.  By the 20th century, such pinpoint assignations of skull regions had been discredited.

Another fallacy is that people “use only 10% of their brains,” an assumption deriving from a misunderstanding of studies of sensory-evoked responses, in which “primary” afferent input (e.g. vision, touch, hearing) activates only a small percentage of the cortex.  This result sidesteps the fact that most of the cortex is devoted to association functions that process such information both serially and in parallel.  Embedded in this conclusion is the fact that neurons need to fire; otherwise, their influence on their targets is lessened.  Without reinforcement, synapses weaken, almost as if the input had been “forgotten.”  “Use it or lose it” is the principle of brain activity.  What this means is that our brain is continuously active — all of it.

Contrary to what many researchers espouse, many drugs shown to be efficacious in animals manifest limited effectiveness in humans 2.  In fact, for most drugs tested in animals, the probability that the drug will also be effective in man is little better than a coin toss (~50%).  Unfortunately, the opposite can also be true.  Thalidomide, a drug tested in more than 10 species, hardly ever produced birth defects, except in humans 3.  

Many of these theories were not disproven because of scientific fraud or faulty experiments.  Most were the result of incomplete knowledge, which includes the common problems of study size, limited technology, and the like.  We maintain that the inadequate application of falsifiability by the proponents also played a role; had neuroscientists practiced falsifiability, some of these spectacular failures might have been avoided.

Mixed in with such famous failures are a number of sophisticated and stunning discoveries about the brain.  At the turn of the 20th century, Ramon y Cajal observed that the nervous system consists of individual cells, not a continuous network as was then thought.  Cajal’s work led to the description of the synapse and of chemical transmission across the narrow clefts between neurons.  This description in turn led to the identification of a myriad of transmitters, some of which could alter behavior, followed by the development of psychoactive drugs that modulate mood, movement, and other functions.  Pharmacological intervention soon allowed many patients to live outside an institution, eliminating the need for padded rooms and “lunatic asylums.”  

About 30 years ago, it was thought that humans were born with all the brain cells they would ever have.  It now seems that we lose cells through puberty, yet neurogenesis also occurs in the adult.  The creation of new brain cells, a totally foreign concept until recently, is now accepted wisdom.  How to control such generation is a topic of study in a number of neurodegenerative disorders.

In science, certain simple conclusions can have an unintended impact.  The conclusion that Benjamin Libet reached in his 1980s experiments on the Readiness Potential is one example.  Because the “will” to perform a movement appeared to occur before the actual movement, and the person was not aware of this intention, Libet concluded that we perform movements through “unconscious” processes.  Unfortunately, this conclusion led to another, disturbing, conclusion: that our subconscious was responsible for our voluntary actions.  By extension, this meant that there was no free will.  The implications for personal responsibility had unwanted effects, including advancing legal arguments absolving miscreants of culpability.  However, in Libet’s work, the person studied was fully conscious, not unconscious.  Moreover, while awake, we are aware of our environment as we navigate it, even though we do not expressly attend to any particular event.  In other words, we are aware of cars and pedestrians as we carry on a conversation, often moving to avoid collisions.  In fact, we are “pre-consciously” aware of the world around us and respond appropriately, even when we do not attend to a particular event.  This interpretation makes it clear that we are indeed responsible for our actions, for our voluntary movements.  However, it is also clear that the perception of that environment, whether pre-conscious or conscious, is altered in mental disease.  Disorders like psychosis can dramatically alter these perceptions and thus guide actions for which the person cannot be held responsible.  That is why proper diagnosis of mental disease is essential.

It is inarguable that brain research has led to remarkable improvements in health and quality of life.  The rather modest investment in funding targeting the brain has paid off exponentially.  While the National Institutes of Health are funded to the tune of ~$40 billion yearly for research ranging from cancer to heart disease to the brain, spending for defense research is more than 10 times greater.  While scientific review committees discuss, dissect, and agonize over a $1 million grant application for almost an hour, Congress makes billion-dollar defense funding decisions in minutes.  We should realize that the successes in brain research will far outweigh the failures, but we should also know that only some of those successes will result in a novel treatment.  

In addition, the annual recurring costs of most brain diseases in terms of medical costs, lost income, and care run into the billions of dollars.  A single novel treatment derived from a typical $5-10 million research program can save billions of dollars every year.  We know that for every dollar spent on research we stand to save thousands every year; conversely, for every dollar we do not spend on research, we stand to pay thousands every year from now on.  

Famous Techniques

One of the most appealing techniques in medicine is magnetic resonance imaging (MRI), which employs strong magnetic fields stimulated by radio waves to produce field gradients that are then computed into images of the brain.  Functional MRI (fMRI) uses blood oxygenation levels to compute images that are assumed to reflect neural activity.  The standard black-and-white displays allow the clinician to detect and measure tumors, infarcts, and even infection, as well as bone, fat, and blood.  This technique has been a lifesaver in a number of disorders in which clear, detailed, and accurate anatomical images are required.  With the advent of more sophisticated MRI computation and fMRI, the displays have become color-coded, so that changes in blood oxygenation are displayed in beautiful false-color images.  That very beauty allows proponents of the technique to oversell their product.

Today, fMRI is being used in research to draw unwarranted conclusions about the workings of the brain in real time.  Researchers undertake studies ranging from voluntary movement to sensory perception to the performance of complex tasks.  Some labs have “concluded” that they can distinguish truth-telling from lying and have pitched fMRI as a lie detector.  The issues related to the technology will not be detailed here, except to emphasize that the field has been remiss in standardizing the generation of images.  As a result, the same experiment carried out by different labs produces different images and, therefore, different conclusions.  The method suffers from a complexity that requires recurring individual decisions about the weighting of factors, decisions applied differently by researchers at multiple stages in the processing of an image.  It is incumbent on researchers in the field to develop standardized methodology.  Perhaps the most serious problem is that the technique actually measures blood flow, not neural activity.  The pretty images represent the aftermath of brain processes that include both excitation and inhibition.  The fMRI is essentially a static image of an ongoing complex event, much like taking a picture of an orchestra and, from the frozen positions of the players, drawing conclusions about the identity of the musical piece being played.

Granted, this illustration may be an exaggeration, but the fact remains that the mesmerizing effect of the images hides that they are based on moving processes and founded on assumptions about how the brain works.  Moreover, overselling of the technology has garnered it an undeserved portion of the funding pie.  Many have naively moved to the technology without developing testable hypotheses and controllable experiments.  The monies for the BRAIN Initiative, a near-monopoly of funding for the method, have been hijacked to the detriment of other valuable technologies.  It is hoped that the limitations of the method will be exposed so that those using more esoteric variables can better justify their decisions.  The value of the technology to the clinical enterprise is without question, but when complex neural processes are studied using what is a measure of blood flow, the conclusions drawn can indeed be questioned.

One policy issue that emerges is the following: what is the harm in throwing money at the problem?  Why not overfund a research area until all the problems are worked out?  The answer is that these practices are unrealistic.  A similar situation arose when agencies began pouring money into AIDS research.  Funding levels that historically had supported between 5% and 15% of submitted AIDS grant applications rose to allow funding of 20% to 25% of applicants.  The pressure to make breakthroughs increased, and “discoveries” came hard and fast, with seemingly rapid progress toward systematically resolving the problems of a complicated infectious process.  Responsible labs were soon confronted with improperly controlled “discoveries” and began spending resources and time validating questionable results and unsupported theories.  Some were forced to attempt to replicate many such findings in order to move the field ahead, if at all.  These were the consequences of the overfunding from which the field suffered. 

Another technique with which the public at large, including attorneys, is enamored is genetics.  This powerful array of methods has exceptional promise.  As a future clinical tool, personalized medicine stands to provide answers to a host of medical questions and may even give us some cures.  But there is the issue of genes and determinism.  Genes are not deterministic but very malleable, likely to produce different proteins under slight changes in conditions.  In addition, genes are co-dependent, such that the expression of some genes depends not only on nearby genes (in terms of chromosome location) but also on some distant genes. 

The field of genetics promises to address the links between genes, the brain, behavior, and neurological and psychiatric diseases.  Neurogenetics thus holds great promise for the future of clinical science, but it has also created a gap.  Its promise has attracted the bulk of funding for genetic studies, pushing the testing of treatments and cures (that is, translational research) further into the future, and opening a gap between patients who need to be treated now and those who may be successfully treated with a genetic intervention in 20 or 30 years.  

Research grants have migrated away from clinical studies toward molecular studies.  Because of the complexity of the genome, short-term answers are unlikely.  Premature genetic interventions could be catastrophic, but the power of the technology has moved funding away from interventions in the clinic.  Translational neuroscience is designed to bring basic science findings promptly to the clinic 4.  It is a response to an Institute of Medicine report from 2003 calling for more emphasis in this area 5.  The concern voiced in the report was the gradual decrease in research grants awarded to MDs (presumably doing research on patients) compared to PhDs (presumably doing research on animals).  Over a 10-year period, the percentage of MDs with awards had decreased from 20% to 4%.  While some attention has been paid to increasing translational research funding, the fact is that most grant reviewers are basic scientists who are not very familiar with clinical testing and human subject research.  Animal studies are more easily controlled than human subject studies, an inherent difference that makes for lower funding scores for human studies.  It is incumbent on the research community to correct this discrepancy, because we stand to lose public trust.  We now live in a world of immediate gratification, and cures far off in the future will not be warmly received.

Is the emphasis on genetics and molecular biology truly warranted?  Definitely, but not at the expense of advances that could improve the quality of life of patients now rather than later.  Some self-scrutiny is called for from the molecular biology community.  For example, researchers should realistically identify some of the limits of their own technology.  One area that needs such scrutiny is the knock-out mouse, a genetically modified mouse in which a gene is prevented from being expressed or is deleted from the genome.  Knocking out the activity of a gene provides knowledge about the function of that gene, making for a marvelous model for the study of disease.  The process is complex, certainly cutting-edge, and very effective if properly employed.  Because of the variety of genes, the technology has given many labs the opportunity to develop their own knock-out mouse, leading to a myriad of new genetically modified mouse lines that researchers can make, buy, study, and manipulate.  

The scientists who developed the technology won the Nobel Prize in Physiology or Medicine in 2007.  Knock-out technology has to date been most successful in identifying genes related to cancer biology.  These genetically altered animals allow the study of genes in vivo, along with their responses to drugs.  The problem, however, has been the inability to generate animals that faithfully recapitulate the disease in man.  This factor is understated, to the detriment of all.  Beyond the glaring facts that ~15% of knock-outs are lethal and that some fail to produce observable changes, there is also the overlooked fact that knocking out a gene will up-regulate many other genes and down-regulate another large group of genes 6.  In nature, single-gene mutations that survive are very rare, so the knock-out is not simply a study of such mutations; it is an attempt at learning all that a single gene does.  The problem is that, without knowing which OTHER genes are up- or down-regulated, the knock-out animal represents an uncontrolled experiment on a creature that never would have existed in nature.  It is incumbent on representatives of the field to discuss these factors and adequately control their studies.

Some of these problems can be overcome by using conditional mutations, in which an agent added to the diet can induce a gene to be expressed, or to cease being expressed, temporarily.  The problem is that this approach does not control the up- or down-regulation of linked genes whose identities are unknown.  Moreover, none of these methods measure compensation.  Very few researchers verify how the absence of the gene creates compensatory changes in the expression of other genes.  For example, knocking out the gap junction protein connexin 36 creates a mouse without connexin 36, but the manipulation leads to overexpression of other connexins 7.  The field of knock-out mouse lines is expanding, increasingly uncontrolled, and funded well beyond its current scientific justification.

A final issue is that, as far as the brain is concerned, protein transcription is a long-term process, whereas the workings of the brain are in the millisecond range.  Over the last several minutes, during the reading of this article, transcription was irrelevant.  None of the perceptual, attentional, or comprehension elements of the information on these pages required gene transcription.  Of course, the long-term storage of the information into memory requires gene transcription, but not before.  Our brain takes about 200 milliseconds to consciously perceive a stimulus.  Gene action is on the order of minutes to hours, which is not the same time scale.  Gene transcription is irrelevant during a conversation with friends about the latest news.  Assessing thought and movement in real time is too fast for genetic methods, but not for two technologies: the electroencephalogram (EEG) and the magnetoencephalogram (MEG).  

The EEG amplifies electrical signals from the underlying cortex (just from the surface of the brain, not from deep structures), but these signals are distorted by skull and scalp.  The MEG measures the magnetic fields of these electrical signals, but requires isolated recording rooms, massive computational power, and superconductors that function in liquid helium.  Helium-free MEGs are now being developed, which would make the technology far less expensive to operate.  The MEG is also very useful in producing exquisite localization of epileptic tissue, especially of the initial ictal (seizure activity) event.  As such, it is reimbursable for diagnostic and surgical uses.  As the only real-time, localizable measure of brain activity, the MEG is likely to make inroads into the rapid events of the brain.  Recent reports suggest that the MEG may also provide detailed images of any part of the body, including functioning muscle.

The Revolution

To gain perspective, we have to understand the battle within the brain sciences that led to the current state.  Following Sir Isaac Newton’s deterministic theories of how the world worked, there arose the idea that the brain worked the same way; that is, all brain function could be reduced to its smallest physical components, the ultimate in micro-determinism.  This approach was manifested in the brain sciences in the form of “behaviorism,” the idea that all actions and thoughts were due to the physicochemical nature of the brain.  A major proponent was B. F. Skinner, who considered free will an illusion and held that everything you did depended on previous actions.  Advances in molecular biology and the structure of DNA fanned the fervor for this view.  There was no room for the consideration of consciousness or subjective states.  This was the world of the reductive, micro-deterministic view of the person and the world.  These views influenced education and policy, suggesting that the issue was not to free man but to improve the way he is controlled: the “behaviorist,” one-way reductionist view of the world in general and of the brain in particular.

The implication of “behaviorism” for thought and action was that consciousness was an epiphenomenon of brain activity, and that the reductionist approach, if only enough details were known, provided a complete explanation of the material world.  This deterministic view of the world began to crumble under the weight of the discoveries of quantum mechanics, which undermined the old deterministic idea of behavior and the denial of free will.  Behaviorism thus was replaced by a “cognitive revolution” that espoused mental states as dynamic emergent properties of brain activity.  That is, a two-way street exists between consciousness and the brain.  This is not to imply a dualism of two independent realms; rather, mental states are fused with the brain processes that generate them.  This approach eliminated the duality of “brain” versus “soul” or “mind.”  Just as evolution undermined the tenets of creationism, the cognitive revolution dissipated the suspicion of a separate “soul.”

The “cognitive revolution” implied causal control of brain states downward as well as upward determinism.  This two-way approach offers a solution to the free will versus determinism paradox: it retains both free will and determinism, integrated in a manner that preserves moral responsibility [8].  In the current scientific mainstream opinion, volition remains causally determined but is no longer subject to strict physicochemical laws.  There is no longer a mind-versus-brain paradox, but a single, functionally interactive process.  Instead of placing the “mind” within physicochemical processes, thought became an emergent property of brain processes.

To use a simplistic parallel, the brain is to thought and action as the orchestra is to music.  Thought and action are emergent properties of the brain just as music is an emergent property of the orchestra.  Music cannot exist without the orchestra, just as thought and action cannot exist without the brain.  This is a reciprocal relationship, one in which brain states influence thought and action (downward), and the external world modulates the activity of the brain (upward).  The “mind” or consciousness can be viewed as downward control of a system changing due to continuously impinging external inputs.

This new viewpoint combines bottom-up determinism with top-down mental causation, the best of both worlds.  The world of reductionism is not entirely rejected, merely considered not to contain all the answers, and an entirely new outlook on nature emerges.  The revolution in neuroscience has provided new values, whereby the world is driven not just by mindless physical forces but also by human mental values.

References

[1] Karl R. Popper. 1983. “Realism and the Aim of Science.” In Postscript to the Logic of Scientific Discovery. W.W. Bartley III, ed. New York: Routledge.

[2] R. Heywood. 1990. “Clinical Toxicity: Could it have been predicted? Post-marketing experience.” In Animal Toxicity Studies: Their Relevance for Man. C.E. Lumley and S. Walker, eds. Lancaster: Quay.

[3] Niall Shanks, Ray Greek, and Jean Greek. 2009. “Are animal models predictive for humans?” Philos. Ethics Humanit. Med. 4: 2.

[4] E. Garcia-Rill. 2012. Translational Neuroscience: A Guide to a Successful Program. New York: Wiley-Blackwell.

[5] L.T. Kohn, ed. 2004. Academic Health Centers: Leading Change in the 21st Century. Committee on the Roles of Academic Health Centers in the 21st Century. Washington, DC: National Academies Press.

[6] D.A. Iacobas, E. Scemes, and D.C. Spray. 2004. “Gene expression alterations in connexin null mice extend beyond gap junctions.” Neurochem. Int. 45: 243-250.

[7] D.C. Spray and D.A. Iacobas. 2007. “Organizational principles of the connexin-related brain transcriptome.” J. Membr. Biol. 218: 39-47.

[8] Roger Sperry. 1976. “Changing concepts of consciousness and free will.” Perspect. Biol. Med. 20: 9-19.


Edgar Garcia-Rill, Ph.D., is the Director for the Center for Translational Neuroscience and a professor in the Department of Neurobiology and Developmental Sciences at UAMS.

Filed Under: all, Non-fiction

Change

When I am old, I want to look back
And feel like my life really mattered.
But for now, I’m young and confused
Often getting bruised and battered.

From where I started it still is surprising
That I am where I am.
Perspective is gained when I pause
And look back to where I began

From humble beginnings I come
Once a kid full of rage
But I’ve learned to live is to love
And slowly I’ve started to change.

I’m not where I want to be yet
But I have faith that someday I might be,
The road is long, but I’ve grown strong
To beat the darkness I’m fighting

I think I’ve become someone different
And it makes me feel pride
To know I can change
I feel like I’m hitting my stride

I finally feel I belong
But part of me still feels uncertain
I try and resist the doubt
That soon they’ll be closing the curtain

I try to stay true and try to stay focused
I can’t help but fear I might fall.
Had clarity in very brief moments
But sometimes it’s hard to recall.

Sometimes it thunders and rains,
I feel like I’m falling apart,
But the storm always breaks before I do
And it feels like I get a fresh start

Like I’m on a boat out on the ocean
The skies are sunny and blue
The gentle breeze puts me in motion
The journey, exciting and new

I float away, not sure where I’m going
I wonder when I’ll get there too
But overwhelming the fear of drowning
Is the high of enjoying the view


Nick Wary is a medical student at UAMS.

Filed Under: all, Poetry
