Tuesday, February 25, 2014

Composites


"Shorter sentences and simple words!" was the battle cry of all my English teachers. Their comments and corrections of our English-language essays and homework assignments were very predictable. Apparently, they had all sworn allegiance to the same secret Fraternal Order of Syntax Police. I am sure that students of the English language all over the world have heard similar advice from their teachers, but English teachers at German schools excel in their diligent use of linguistic guillotines to chop up sentences and words. The problem is that they have to teach English to students who think, write and breathe in German, the lego of languages.



Lego blocks invite the observer to grab them and build marvelously creative and complex structures. The German language similarly invites its users to construct composite words and composite sentences. A virtually unlimited number of composite nouns can be created in German, begetting new words which consist of two, three or more components with meanings that extend far beyond the sum of their parts. The famous composite German word "Schadenfreude" is now used worldwide to describe the shameful emotion of joy when observing harm befall others. It combines "Schaden" (harm or damage) and "Freude" (joy), and its allure lies in the honest labeling of a guilty pleasure and the inherent tension of combining two seemingly discordant words.

The Lego-like qualities of German can also be easily applied to how sentences are structured. Commas are a German writer's best friends. A German sentence can contain numerous clauses and sub-clauses, weaving a quilt of truths, tangents and tangential truths, all combined into the serpentine splendor of a single sentence. Readers may not enjoy navigating their way through such verschachtelt (nested) sentences, but writers take great pleasure in envisioning a reader who unwraps a sentence as if opening a matryoshka doll, only to find that the last word of a mammoth sentence negates its foreshadowed meaning.


Even though our teachers indulged such playfulness when we wrote in German, they were all the harsher when it came to our English assignments. They knew that we had a hankering for creating long sentences, so they returned our essays covered in red ink markings, indicative of their syntactic fervor. This obsession with short sentences and words took the joy out of writing in English. German was the language of beauty and poetry, whereas English became the language best suited for efficient communication. By the time I reached my teenage years, I began to lose interest in writing anything in English beyond our mandatory school assignments. I still enjoyed reading books in English, such as the books of Enid Blyton, but I could not fathom how a language of simple sentences and simple words could be used to create works of literary beauty. This false notion fell apart when I first read "Things Fall Apart" by Chinua Achebe.



The decision to read "Things Fall Apart" was not completely arbitrary. My earliest memories of this world are those of the years I spent as a child in Igboland. My family moved from Pakistan to Germany when I was one year old, but we soon moved on to Nigeria. Germany was financing the rehabilitation of the electrical power grid that had been destroyed during the Biafra War. My father was one of the electrical engineers sent from Germany to help with the restoration and expansion of the electrical power supply in the South-Eastern part of Nigeria – the region which was home to the Igbo people and which had attempted and failed to secede as the Republic of Biafra.

We first stayed in Enugu, the former capital of the transient Republic of Biafra, and then lived in the city of Aba. My memories of the time in Igboland are just sequences of images and scenes, and it is difficult to make sense of all of them: kind and friendly people, palm trees and mysterious forests, riding a tricycle in elliptical loops, visits to electrical sub-stations. We returned to Germany when I was four years old. I would never live in Igboland again, but recalling the fragmented memories of those early childhood years has always evoked a sense of comfort and joy in me. When I came across "Things Fall Apart" as a fourteen-year-old and learned that it took place in an Igbo village, I knew that I simply had to read it.



I was not prepared for the impact the book would have on me. Great books shake us up, change us in a profound and unpredictable manner, leaving footprints that are etched into the rinds of our soul. "Things Fall Apart" was the first great English language book that I read. I was mesmerized by its language. This book was living proof that one could write a profound and beautiful book in English, using short, simple sentences.
As the Ibo say: "When the moon is shining the cripple becomes hungry for a walk."
And so Okonkwo was ruled by one passion— to hate everything that his father Unoka had loved. One of those things was gentleness and another was idleness.
Living fire begets cold, impotent ash.
A child cannot pay for its mother's milk.
It wasn't just the beautiful language, aphorisms, Igbo proverbs and haunting images that made this book so unique. "Things Fall Apart" contained no heroes. The books that I had read before "Things Fall Apart" usually made it obvious who the hero was. But "Things Fall Apart" was different. Okonkwo was no hero, not even a tragic hero. But he also was no villain. I could see myself in him, as in so many of the characters in the book, and yet I was also disgusted by some of the abhorrent acts they committed. I wanted to like Okonkwo, but I could not like a man who participated in the killing of his adopted son or nearly killed his wife in a fit of anger.
Guns fired the last salute and the cannon rent the sky. And then from the center of the delirious fury came a cry of agony and shouts of horror. It was as if a spell had been cast. All was silent. In the center of the crowd a boy lay in a pool of blood. It was the dead man's sixteen-year-old son, who with his brothers and half-brothers had been dancing the traditional farewell to their father. Okonkwo's gun had exploded and a piece of iron had pierced the boy's heart.
Achebe was not judging or mocking his characters, but sharing them with us. He was telling us about how real humans think and behave. As I read the book, I felt that I was being initiated into life. Life would be messy. Most of us would end up being neither true heroes nor true villains but composites of heroism and villainy. If I did not want to end up like Okonkwo, the ultimate non-negotiator, I needed to accept the fact that my life would be a series of negotiations: negotiations between individuals, negotiations between conflicting identities and negotiations between values and cultures. The book described a specific clash of cultures in colonial Africa, but it was easy to apply the same clash to so many other cultures. I tried to envision Okonkwo as an Indian farmer whose world began to fall apart when Arab armies invaded the Sindh. I imagined Okonkwo as a Native American, a Roman or a Japanese warrior, each negotiating his way through cultural upheavals. The history of humankind is always that of things falling apart and, importantly, that of rebuilding after the falling apart.
As soon as the day broke, a large crowd of men from Ezeudu's quarter stormed Okonkwo's compound, dressed in garbs of war. They set fire to his houses, demolished his red walls, killed his animals and destroyed his barn. It was the justice of the earth goddess, and they were merely her messengers. They had no hatred in their hearts against Okonkwo. His greatest friend, Obierika, was among them. They were merely cleansing the land which Okonkwo had polluted with the blood of a clansman.
I read "Things Fall Apart" to find my past, but it defined my future. It helped me recognize the beauty of the English language and prepared me for life in a way that no book had ever done before.

Notes: All quotes are from "Things Fall Apart" by Chinua Achebe.

Image Credits: Stack of "Things Fall Apart" (by Scartol via Wikimedia Commons), Photo of a porcelain insulator with a bullet hole probably from the Biafra war, Photo taken from the Presidential Hotel in Enugu 1973. 

Monday, February 17, 2014

Science Journalism and the Inner Swine Dog

A search of the PubMed database, which indexes scholarly biomedical articles, reveals that 997,508 articles were published in the year 2011, which amounts to roughly 2,700 articles per day. Since the database does not include all published biomedical research articles, the actual number of published biomedical papers is probably even higher. Most biomedical researchers work in defined research areas, so perhaps only 1% of the published articles may be relevant for their research. As an example, the major focus of my research is the biology of stem cells, so I narrowed down the PubMed search to articles containing the expression “stem cells”. I found that 14,291 “stem cells” articles were published in 2011, which translates to an average of 39 articles per day (assuming that one also reads scientific papers on weekends and during vacations, which is probably true for most scientists). Many researchers also tend to have two or three areas of interest, which further increases the number of articles one needs to read.
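For readers who want to reproduce or update such counts, the short sketch below shows one way of querying PubMed programmatically via the NCBI E-utilities interface. This is a minimal example of my own, not part of the original analysis; the query syntax is standard PubMed syntax, but the exact counts will differ from the numbers above because the database is continuously updated.

```python
# Minimal sketch: counting PubMed records for a query via the NCBI
# E-utilities "esearch" endpoint. Counts shift over time as PubMed
# is updated, so they will not exactly match the numbers in the text.
import json
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term):
    """Return the number of PubMed records matching a query term."""
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": term,
        "rettype": "count",
        "retmode": "json",
    })
    with urllib.request.urlopen(f"{EUTILS}?{params}") as response:
        return int(json.load(response)["esearchresult"]["count"])

total = pubmed_count("2011[pdat]")                  # everything published in 2011
stem = pubmed_count('"stem cells" AND 2011[pdat]')  # the narrower field

print(f"All 2011 articles: {total} (~{total / 365:.0f} per day)")
print(f'2011 "stem cells" articles: {stem} (~{stem / 365:.0f} per day)')
```

Dividing the yearly counts by 365 reproduces the per-day arithmetic used above.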

Needless to say, it has become impossible for researchers to read all the articles published in their fields of interest, because if they did, they would not have any time left to conduct experiments of their own. To avoid drowning in the information overload, researchers have developed multiple strategies to survive and navigate their way through all this published data. These strategies include relying on recommendations of colleagues, focusing on articles published in high-impact journals, only perusing articles that are directly related to one’s own work or only reading articles that have been cited or featured in major review articles, editorials or commentaries. As a stem cell researcher, I can use the above-mentioned strategies to narrow down the stem cell articles that I ought to read to the manageable number of about three or four articles a day. However, scientific innovation is fueled by the cross-fertilization of ideas, and the most creative ideas are derived from combining seemingly unrelated research questions. Therefore, my challenge is not only to stay informed about important developments in my own areas of interest, but also to know about major developments in other scientific domains such as network theory, botany or neuroscience, because discoveries in such “distant” fields could inspire me to develop innovative approaches in my own work.


In order to keep up with scientific developments outside of my area of expertise, I have begun to rely on high-quality science journalism, which can be found in selected print and online publications or in science blogs. Good science journalists accurately convey complex scientific concepts in simple language, without oversimplifying the actual science. This is easier said than done, because it requires a solid understanding of the science as well as excellent communication skills. Most scientists are not trained to communicate with a general audience, and most journalists have had very limited exposure to actual scientific work. To become good science journalists, either scientists have to be trained in the art of communicating results to non-specialists or journalists have to acquire the scientific knowledge pertinent to the topics they want to write about. The training of science journalists requires time, resources and good mentors.

Once they have completed their training and start working as science journalists, they still need adequate time, resources and mentors. When writing about an important new scientific development, good science journalists do not just repeat the information provided by the researchers or contained in the press release of the university where the research was conducted. Instead, science journalists perform the necessary fact-checking to ensure that the provided information is indeed correct. They also consult the scientific literature as well as other scientific experts to place the new development in the context of the existing research. Importantly, science journalists then analyze the new scientific development, separating the actual scientific data from speculation and pointing out the limitations and implications of the work.

Science journalists also write for a very broad audience, and this poses a challenge of its own. Their readership includes members of the general public interested in new scientific findings, politicians and members of private industry who may base political and economic decisions on scientific findings, patients and physicians who want to stay informed about innovative new treatments and, as mentioned above, scientists who want to know about new scientific research outside of their area of expertise.

Unfortunately, I do not think that it is widely appreciated how important high-quality science journalism is and how much effort it requires. Limited resources, constraints on a journalist’s time and the pressure to publish sensationalist articles that exaggerate or oversimplify the science in order to attract a larger readership can compromise the quality of the work. Two recent examples illustrate this: the so-called Jonah Lehrer controversy, in which the highly respected and popular science journalist Jonah Lehrer was found to have fabricated quotes, plagiarized and oversimplified research, and the more recent case in which the Japanese newspaper Yomiuri Shimbun ran a story about the use of induced pluripotent stem cells to treat patients with heart disease, which turned out to be based on a researcher’s fraudulent claims. The case of Jonah Lehrer was a big shock for me. I had enjoyed reading a number of his articles and blog posts and, at first, it was difficult for me to accept that his work contained so many errors and so much evidence of misconduct. Boris Kachka has recently written a very profound analysis of the Jonah Lehrer controversy in New York Magazine:

Lehrer was the first of the Millennials to follow his elders into the dubious promised land of the convention hall, where the book, blog, TED talk, and article are merely delivery systems for a core commodity, the Insight.
The Insight is less of an idea than a conceit, a bit of alchemy that transforms minor studies into news, data into magic. Once the Insight is in place—Blink, Nudge, Free, The World Is Flat—the data becomes scaffolding. It can go in the book, along with any caveats, but it’s secondary. The purpose is not to substantiate but to enchant.
Kachka’s expression “Insight” describes our desire to believe in simple narratives. Any active scientist knows that scientific findings tend to be more complex and difficult to interpret than anticipated. There are few simple truths or “Insights” in science, even though part of us wants to seek out these elusive simple truths. The metaphor that comes to mind is the German expression “der innere Schweinehund”, which literally translates to “the inner swine dog”. The expression may evoke the image of a chimeric pig-dog beast created by a mad German scientist in a Hollywood World War II movie, but in Germany it is actually used to describe a metaphorical inner creature that wants us to be lazy, seek out convenience and avoid challenges. In my view, scientific work is an ongoing battle with our “inner swine dog”. We start experiments with simple hypotheses and models, and we are usually quite pleased with results that confirm these anticipated findings because they allow us to be intellectually lazy. However, good scientists know that more often than not, scientific truths are complex, and we need to force ourselves to continuously challenge our own scientific concepts. Usually this involves performing more experiments, analyzing more data and trying to interpret data from many different perspectives. Overcoming this intellectual laziness requires work, but most of us who are passionate about science enjoy these challenges and seek out opportunities to battle against our “inner swine dog” instead of succumbing to a state of perpetual intellectual laziness.

When I read Kachka’s description of why Lehrer was able to get away with his fabrications and over-simplifications, I realized that it was probably because Lehrer gave us the narratives we wanted to believe. He provided “Insight”, portraying scientific research in a false shroud of certainty and simplicity. Even though many of us look forward to overcoming intellectual laziness in our own work, we may not be used to challenging our “inner swine dog” when we learn about scientific topics outside of our own areas of expertise. This is precisely why we need good science journalists, who challenge us intellectually by avoiding over-simplifications.

A different but equally instructive case of poor science journalism occurred when the widely circulated Japanese newspaper Yomiuri Shimbun reported in early October of 2012 that the Japanese researcher Hisashi Moriguchi had transplanted induced pluripotent stem cells into patients with heart disease. This was quite a sensation, because it would have been the first transplantation of this kind of stem cells into actual patients. For those of us in the field of stem cell research, this came as a big surprise and did not sound very believable: the story suggested that the work had been performed in the United States, and most of us knew that obtaining approvals for using such stem cells in clinical studies would have been very challenging. However, many people who were not acquainted with the complexities of using stem cells in patients may well have believed the story. Within days, it became apparent that the researcher’s claims were fraudulent. He had said that he had conducted the studies at Harvard, but Harvard stated that he was not currently affiliated with the university and that there was no evidence of any such studies ever being conducted there. His claims of how he derived the cells and in how little time he supposedly performed the experiments were also debunked.

This was not the first incident of scientific fraud in the world of stem cell research, and it unfortunately will not be the last. What makes this incident noteworthy is how the newspaper Yomiuri Shimbun responded to its reporting of the fraudulent claims. It removed the original story from its website and issued public apologies for its poor reporting. The English-language version of the newspaper listed the mistakes in an article entitled “iPS REPORTS--WHAT WENT WRONG / Moriguchi reporting left questions unanswered”. These mistakes included inadequate fact-checking of the researcher’s claims and affiliations by the reporter and a failure to consult other scientists about whether the findings sounded plausible. Interestingly, the reporter had identified some red flags and concerns:

--Moriguchi had not published any research on animal experiments.
--The reporter had not been able to contact people who could confirm the iPS cell clinical applications.
--Moriguchi's affiliation with Harvard University could not be confirmed online.
--It was possible that different cells, instead of iPS cells, had been effective in the treatments.
--It was odd that what appeared to be major world news was appearing only in the form of a poster at a science conference.
--The reporter wondered if it was really possible that transplant operations using iPS cells had been approved at Harvard.
The reporter sent the e-mail to three others, including another news editor in charge of medical science, on the same day, and the reporter's regular updates on the topic were shared among them.
The science reporter said he felt "at ease" after informing the editors about such dubious points. After receiving explanations from Moriguchi, along with the video clip and other materials, the reporter sought opinions from only one expert and came to believe the doubts had been resolved.
In spite of these red flags, the reporter and the editors decided to run the story, giving in to intellectual laziness and the desire to run a sensational piece instead of tediously following up on all the red flags. They had a story about a Japanese researcher making a ground-breaking discovery in a very competitive area of stem cell research, and this was a story that their readers would probably love. For this unprofessional conduct, the reporter and the editors received reprimands and penalties. Another article in the newspaper summarizes the punitive measures:

Effective as of next Thursday, The Yomiuri Shimbun will take disciplinary action against the following officials and employees:
--Yoshimitsu Ohashi, senior managing director and managing editor of the company, and Takeshi Mizoguchi, corporate officer and senior deputy managing editor, will each return 30 percent of their remuneration and salary for two months.
--Fumitaka Shibata, a deputy managing editor and editor of the Science News Department, will be replaced and his salary will be reduced.
--Another deputy managing editor in charge of editorial work for the Oct. 11 edition will receive an official reprimand.
--The salaries of two deputy editors of the Science News Department will be cut.
--A reporter in charge of the Oct. 11 series will receive an official reprimand.
I have mixed feelings about these punitive actions. I think it is commendable that the newspaper apologized without reservations or excuses and listed its mistakes. The reprimands and penalties also highlight that the newspaper takes its science journalism very seriously and recognizes the importance of high professional standards. The penalties were also more severe for the editors than for the reporter, which may reflect the fact that the reporter did consult with the editors, and they decided to run the story even though the red flags had been pointed out to them. My concerns arise from the fact that I am not sure punitive actions will solve the problem, and they leave a lot of questions unanswered. Did the newspaper evaluate whether the science journalists and editors had been appropriately trained? Did the science journalist have the time and resources to conduct his or her research in a conscientious manner? Importantly, will science journalists be given the appropriate resources and protected from pressures or constraints that encourage unprofessional science journalism? We do not know the answers to these questions, but providing the infrastructure for high-quality science journalism is probably going to be more useful than mere punitive actions. We can also hope that media organizations all over the world learn from this incident, recognize the importance of science journalism and put mechanisms in place to ensure its quality.
Note: Image via Wikimedia Commons/ Norbert Schnitzler: Statue “Mein Innerer Schweinhund” in Bonn. An earlier version of this article was first published on my Next Regeneration blog.

Sunday, February 16, 2014

“It Is An Opportunity For Great Joy”

I was about 12 years old when I found out that my grandfather was born on 12/12/12. If he were alive, he would be exactly 100 years old today. I found out about his birthday when he came to stay with us in Munich for eye surgery. He was a diabetic and had been experiencing a deterioration in his vision. At that time, it was very difficult to find an eye surgeon in Pakistan who would be able to perform the surgery. My grandfather spoke many languages, such as Punjabi, Urdu, Persian, English, Arabic and some Sanskrit, but he could not speak German. His visit occurred during my school holidays, so I was designated to be his official translator for the doctor visits and his hospital stay.



On the afternoon before his surgery, we went to the hospital and I was filling out the registration forms when I asked my grandfather about his birthday and he said 12/12/12. I was quite surprised to find out that he had such a wonderful combination of numbers. When the lady at the registration desk saw the date, she asked me whether he was absolutely sure this was the correct date. I translated this for my grandfather and he smiled and said something along the lines of, “It is more or less the correct date. Nobody is exactly sure, but it is definitely very easy to remember”. I knew that I was supposed to be a translator, but this required a bit more finesse than a straightforward translation. One cannot tell a German civil servant that a date is more or less correct. If we introduced uncertainty at this juncture, who knew what the consequences would be.

 I therefore paraphrased my grandfather’s response as, “Yes, it is absolutely correct!”

 She then said, “Eine Schnapszahl!”

My grandfather wanted me to translate this, and I was again at a loss for words. “Schnapszahl” literally means “Schnaps number” and is a German expression for a number with repeated digits, such as 33 or 555. The origin of the word probably lies either in the fact that a drunken person may have transient double vision or in a drinking game where one drinks Schnaps after reaching repeated digits when adding up numbers. I was not quite sure how to translate this into Urdu without having to go into the whole background of how German idioms often jokingly refer to alcohol.

 I decided to translate her comment as “What a memorable date”, and my grandfather nodded.

We were then seen by a medical resident who also pointed out the unique birthday. His comment was “Darauf sollten wir einen trinken!”, which is another German idiom and literally translates to “we should all have a drink to celebrate this”, but really just means “Hooray!” or “Great!”

My grandfather wanted to know what the doctor had said and I was again in a quandary. Should I give him an accurate translation and explain that this was just another German idiom, not intended as a cultural insult to a Pakistani Muslim? Or should I just skip the whole alcohol bit? Translation between languages is tough enough, but translating with cultural sensitivity was more than I could handle. My Urdu was not very good to begin with, and all I could come up with was the rather silly Urdu translation “It is an opportunity for great joy”. My grandfather gave me a puzzled look, but did not ask any questions.

 ***** 

On the day after my grandfather’s eye surgery, the ophthalmologist and the residents came by for morning rounds. They removed his eye-patch, inspected the eye and told me that everything looked great. He just needed a few more days of recovery and would soon be able to go home. After putting the gauze and eye-patch back on, the doctors moved on to the next patient.

 Once the doctors had completed rounds, I made the acquaintance of the head nurse. She seemed to think that the eye ward was her military regiment and was running it like a drill-sergeant. She walked into every room and ordered all the patients to get out of bed and walk to the common area. Only lazy people stayed in bed, she said. The best way to recuperate was to move about.  

 I told her that I did not think my grandfather was ready to get up. 

“Did any doctor forbid him to get up?”

 “No, not really”, I replied.

 “If he has two legs, he can walk to the common room. If not, we will provide a wheelchair.”

 “He just had surgery yesterday and needs to rest”, I protested and pointed to my grandfather’s eye-patch.

 “Yesterday was yesterday and today is today!” was the response from the drill-sergeant.

 This statement did not seem very profound to me and I was waiting for a further explanation, but the drill-sergeant had already moved on, ordering the patients from the neighboring rooms to get up. My grandfather and I did not have much of a choice, so we joined the procession of one-eyed men who looked like retired, frail pirates. They were slowly shuffling out of their rooms towards the common area.

 The common area consisted of chairs and sofas as well as a couple of tables. I sat down in a corner with my grandfather, and we started talking. He told me stories from his life, including vivid descriptions of how he and his friends proudly defied the British colonialists. My grandfather recited poems from the Gulistan of the Persian poet Saadi for me in Persian and translated them into Urdu. He wanted to know about German history and what I was learning at school. He asked me if I knew any poems by Goethe, because the Indian poet Iqbal had been such a great admirer of Goethe’s poetry.

 We talked for hours. Like most children, I did not realize how much I enjoyed the conversations. It was only years later when my grandfather passed away that I wished I had taken notes of my conversations with him. All I currently have are fragmented memories of our conversations, but I treasure these few fragments.

 I then pulled out a tiny travel chess set that I had brought along, and we started playing chess. I knew that he had trouble distinguishing some of the pieces because of his eye surgery. I took advantage of his visual disability and won every game. During my conversations with my grandfather and our chess games, I noticed that some of the other men were staring at us. Perhaps they were irritated by having a child around. Maybe they did not like our continuous chatting or perhaps they just did not like us foreign-looking folks. I tried to ignore their stares, but they still made me quite uncomfortable.

 On the next day, we went through the same procedure. Morning rounds, drill sergeant ordering everyone to the common area, conversations with my grandfather and our chess games. The stares of the other patients were now really bothering me. I was wondering whether I should walk up to one of the men and ask him whether they had a problem with me and my grandfather. Before I could muster the courage, one of the men got up and walked towards us. I was a bit worried, not knowing what the man was going to do or say to us.

“Can you ask your grandfather, if I can borrow you?”

 “Borrow me?”, I asked, taken aback.

 “He gets to tell you all these stories and play chess with you for hours and hours, and I also want to have someone to talk to.”

 Once he had said that, another patient who was silently observing us chimed in and said that he would like to know if he could “borrow” me for a game of chess. I felt really stupid. The other patients who had been staring at me and my grandfather were not at all racist or angry towards us, they were simply envious of the fact that my grandfather had someone who would listen to him.

 I tried to translate this for my grandfather, but I did not know how to translate “borrow”. My grandfather smiled and understood immediately what the men wanted, and told me that I should talk to as many of the patients as possible. He told me that the opportunity to listen to others was a mutual blessing, both for the narrator as well as the listener.

On that day and the next few days that my grandfather spent in the hospital, I spoke to many of the men and listened to their stories about their lives, their health, their work and even stories about World War 2 and life in post-war Germany. I also remember how I agreed to play chess, but when I pulled out my puny little travel chess set, my opponent laughed and brought a huge chess set from a cupboard in the common area. He beat me, and so did my grandfather, who then also played chess with me on this giant-size chess board, which obliterated the visual advantage that my travel set had offered.

 *********** 

Since the time I spent with my grandfather and the other patients on the eye ward, I have associated medicine with narration. All humans want to be narrators, but many have difficulties finding listeners. Illness is often a time of vulnerability and loneliness. Narrating stories during this time of vulnerability is a way to connect to fellow human beings, which helps overcome the loneliness. The listeners can be family members, friends or even strangers. Unfortunately, many people who are ill do not have access to family members or friends who are willing to listen. This is why healthcare professionals such as nurses or physicians can play a very important role. We listen to patients so that we can obtain clues about their health, searching for symptoms that can lead to a diagnosis. However, sometimes the process of listening itself can be therapeutic in the sense that it provides comfort to the patient.

Even though I mostly work as a cell biologist, I still devote some time to the practice of medicine. What I like about being a physician is the opportunity to listen to patients or their family members. I prescribe all the necessary medications and tests according to the cardiology guidelines, but I have noticed that listening to patients and giving them an opportunity to narrate their story provides immediate relief.

It is indeed “an opportunity for great joy” when the patient experiences the joy of having an audience and the healthcare provider experiences the joy of connecting with the patient. I have often wondered whether there is any good surrogate for listening to the patient. Medicine is moving towards reducing face-to-face time between healthcare providers and patients in order to cut costs or maximize profits. The telemedicine approach, in which patients are assessed by physicians in other geographic locations, is gaining ground. Patients now often fill out checklists about their history instead of narrating it to the physicians or nurses. All of these developments are reducing the opportunity for the narrator-listener interaction between patients and healthcare providers. However, social networks, blogs and online discussion groups may provide patients with opportunities to narrate their stories (those directly related to their health as well as other stories) and find an audience. I personally prefer the old-fashioned style of narration: the listener can give instant feedback, and facial expressions and subtle nuances can help reassure the narrator. The key is to respect the narrative process in medicine and to help patients find ways to narrate their stories in a manner that they are comfortable with.

Note: An earlier version of this article was first published on the Next Regeneration blog.

'The Epichorus': Creative Heretics Build Bridges Between Faiths

When we hear the word religion, we often associate it with violence or conflict. The reason for this is that religion is frequently used as a means to ostracize or demean fellow humans and to justify violence. Political and socioeconomic causes of conflicts are also sometimes masked by invoking religious differences. This turns religion into a handy tool to promote certain agendas and perpetuate conflicts without addressing the underlying political and socioeconomic problems.



To address this abuse of religion, spiritual leaders and visionaries try to organize interfaith events. The hope is that when people of different religious backgrounds meet each other, they will engage in a dialogue. This should in turn allow them to recognize the similarities between their faith traditions and move past perceived differences. Such a fruitful interfaith dialogue could circumvent the abuse of religion as a tool for promoting conflicts. However, the problem with such interfaith events is that they do not necessarily result in a true dialogue. During many of the interfaith events that I have attended, people of different faiths politely present ideas, traditions or doctrines from their respective religions and engage in a discussion. Instead of this encounter becoming a transforming dialogue, participants sometimes end up merely "performing" monologues for each other.

 One reason for the absence of true dialogue may be the inertia that prevents us from leaving our comfort zone. Few of us like to question the religious paradigms or doctrines that we grow up with. True dialogue is a spiritual and intellectual adventure. It is an exploration or journey that requires courage and openness, because it might challenge the religious dogmas that we like to cling to for the sake of convenience.

I recently came across a beautiful form of true interfaith dialogue: the music of the band the Epichorus. Rabbinical student Zach Fredman and Muslim singer Alsarah co-founded the band, and their musical love-child is the wonderful album One Bead. In this album, Zach, Alsarah and the other members of the band combine Jewish and Sudanese-Arab musical traditions to create music that transcends the boundaries of culture or religion. The lyrics of the songs are mostly drawn from the Jewish tradition, such as the "Song of Songs" (Song of Solomon) from the Old Testament, but the album also includes the traditional Sudanese love song "Nanaa Al Genina" (The Mint Garden).



 The common theme of the One Bead songs is love, the emotion that is at the core of our human existence and spirituality. Listening to the music, one feels a profound sense of harmony that exists between the various cultural and religious traditions that are part of the Epichorus. The lyrics for two of the songs are taken from the "Song of Songs" and this reminded me of something that the German poet Johann Wolfgang von Goethe wrote. He composed a cycle of poems called "West-östlicher Diwan" (or "West-Eastern Divan" in English). Goethe wrote these poems to represent a fusion between Eastern and Western traditions. He also wrote essays in which he elaborated on his poems and one of his comments specifically refers to the Old Testament "Song of Songs", of which he says, "...als dem Zartesten und Unnachahmlichsten, was uns von Ausdruck leidenschaftlicher, anmutiger Liebe zugekommen," which translates into English as: "...it is the most tender and unique expression of passionate and graceful love that has been given to us."

 I asked Zach how he chose the name the Epichorus for their band and he said that it was a reference to Epikoros (or Apikoros), which is a term used in the Jewish tradition to describe outsiders or heretics. The members of the Epichorus are indeed outsiders in the sense that they have the courage to look beyond the boundaries of their religious traditions and have sought out a creative dialogue with people outside of their faith traditions. They are also "heretics" in the original Greek sense of the word, describing people who "make choices." They chose to embark on a creative adventure and found that they could engage in an authentic dialogue by creating beautiful songs together.

 The music of the Epichorus is an excellent example of how creating culture together can promote true interfaith dialogue. Hopefully, we will hear more songs from the Epichorus, but I also hope that other artists, musicians or poets will be inspired by them and seek out their own creative paths to foster dialogue between faiths.

Note: An earlier version of this article was first published in the Huffington Post.

“Occidentophobia”: The Elephant in the Room

Scapegoating Muslims has become a convenient tool for promoting a far-right political agenda. A Center for American Progress (CAP) report carefully outlined anti-Muslim fear-mongering in the United States, with the long-term hope that, by exposing the roots of anti-Muslim hostility, strategies can be developed to overcome such prejudice. However, relatively little attention is paid to "Occidentophobia," or more appropriately (since, like "Islamophobia," it does not constitute a true "phobia") anti-Western sentiments among Muslims.

An international Pew Research Center report, with the innocuous title "Muslim-Western Tensions Persist," discussed the extent of Muslim anti-Western prejudice. The report summarized the results of a survey of stereotypes about Westerners among Muslims living in predominantly Muslim countries such as Indonesia, Egypt and Pakistan. A median of 68 percent of the Muslim respondents associated "selfish" with Westerners, while only a median of 35 percent of non-Muslims living in countries such as Great Britain, the United States or Germany associated "selfish" with Muslims. One could conceivably attribute this anti-Western hostility to the fact that Muslims living in predominantly Muslim countries may not have come into personal contact with Westerners. After all, this Pew survey report did not provide data about the attitudes of Muslims living in the West. However, an earlier Pew Research Center report released in 2006 titled "The Great Divide: How Muslims and Westerners View Each Other" also surveyed Muslims living in European countries and offered a fairly bleak picture of the anti-Western prejudice among European Muslims. Muslims in Britain gave especially negative responses: 69 percent of those surveyed attributed three or more negative traits such as "greedy," "selfish," "arrogant" or "immoral" to Westerners. This antagonistic attitude was in sharp contrast to the comparatively positive views of the non-Muslim general public in Britain, of whom only 30 percent attributed three or more negative traits to Muslims.

 The "selfish" or "greedy" traits attributed by Muslims in Britain to Westerners are especially noteworthy, since the majority of the top-ranking ranking countries in terms of charitable behavior, as measured by the World Giving Index of the Charities Aid Foundation (CAF), happen to be countries that are typically associated with "Western culture," such as Australia, Canada, Switzerland, USA or Great Britain. According to the CAF, people living in Western countries appear to be far more likely to donate money or volunteer their time than those living in predominantly Muslim countries. This even holds true for relatively wealthy Muslim countries such as the UAE or Saudi Arabia, which came in at ranks 50 and 86 for charitable behavior, whereas Great Britain achieved an international rank of 8. These facts therefore raise questions about the seeming mis-perception of Western culture. Is the Muslim anti-Western prejudice due to ignorance, or is it the consequence of a very selective view of Western society? Is such anti-Western prejudice perhaps fueled by organizations with a political or ideological agenda, similar to those that promote anti-Muslim prejudice as uncovered by the Center for American Progress? Does anti-Western prejudice of Muslims living in the West manifest itself differently in Europe and in North America?

Western Muslims frequently emphasize the rich cultural heritage and diversity of Islam as a means to combat anti-Muslim prejudice. At interfaith or intercultural events, Western Muslims discuss poems of the philosopher-poet Rumi and depict the vast array of cultural traditions in the Muslim world, ranging from Indian and Pakistani Qawwali music to Moroccan cuisine. For many non-Muslim Westerners, the abstract entities "Muslim" or "Islam" that were previously associated with fear are thus transformed into vibrant human encounters, allowing them to move beyond their prejudice.

However, when I discuss the concept of Western culture with Western Muslims, I often find that their perception of Western culture does not include the same desirable standard of openness. "The West" is regularly seen as some combination of loss of moral values, imperialism and drone attacks, a description reminiscent of the Star Trek Borg species, which assimilates and then destroys other cultures. Many struggle with recognizing their own "Western" identity, and few seem to associate "the West" with its grand cultural heritage, which reaches far back to Plato's The Republic but also includes Baroque music, Friedan's "The Feminine Mystique," and the environmental movement. Even though many Western Muslims that I have met encountered Western culture and history during their grade school and university education, these encounters appear to be frequently tinged with an unnecessary denigration of Western culture. Helping Muslims in Europe and North America appreciate the rich mosaic and diversity that comprises "Western" culture might be an important step towards overcoming anti-Western prejudice.

In addition to these anecdotal experiences, analytical studies such as the one conducted by the Center for American Progress are also necessary to help uncover the extent and roots of anti-Western prejudice among Muslims living in the West. The efforts to understand and combat anti-Muslim prejudice need to be accompanied by equal efforts to understand and combat anti-Western prejudice. Discussions about Muslim anti-Western prejudice and hostility are currently often confined to far-right organizations in Europe and North America and are thus coupled with a specific agenda to malign Muslims using outrageous assertions and exaggerations. The analysis of Muslim anti-Western hostility deserves more serious scrutiny. Muslim anti-Western prejudice is sometimes treated like an elephant in the room, ignored even by liberal-progressive organizations and movements. It would be a fallacy to believe that prejudice and hostilities between Muslims and non-Muslims can be resolved by just asking non-Muslims to show more tolerance and understanding, without demanding reciprocity from Muslims.

  An earlier version of this article was published on the Guernica blog.

Thursday, February 13, 2014

Creativity in Older Adults: Learning Digital Photography Improves Cognitive Function


The unprecedented increase in mean life expectancy during the past centuries and a concomitant drop in the birth rate have resulted in a major demographic shift in most parts of the world. The proportion of fellow humans older than 65 years of age is higher than at any time before in our history. This trend of generalized population ageing will likely continue in developed as well as in developing countries. Population ageing has sadly also given rise to ageism, prejudice against the elderly. In 1950, more than 20% of citizens aged 65 years or older in the developed world participated in the labor force; that percentage has now dropped to below 10%. If the value of a human being is judged primarily by economic productivity – as it so commonly is in societies driven by neoliberal capitalist values – it is easy to see why prejudices against senior citizens are on the rise. They are viewed as non-productive members of society who do not contribute to economic growth and instead represent an economic burden, because they drain valuable dollars required to treat the chronic illnesses associated with old age.


In "Agewise: Fighting the New Ageism in America", the scholar and cultural critic Margaret Morganroth Gullette ties the rise of ageism to unfettered capitalism:
There are larger social forces at work that might make everyone, male or female, white or nonwhite, wary of the future. Under American capitalism, with productivity so fetishized, retirement from paid work can move you into the ranks of the "unproductive" who are bleeding society. One vile interpretation of longevity (that more people living longer produces intolerable medical expense) makes the long-lived a national threat, and another (that very long-lived people lack adequate quality of life) is a direct attack on the progress narratives of those who expect to live to a good old age. Self-esteem in later life, the oxygen of selfhood, is likely to be asphyxiated by the spreading hostile rhetoric about the unnecessary and expendable costs of "aging America".
Instead of recognizing the value of the creative potential, wisdom and experiences that senior citizens can share with their communities, we treat them as if they were merely a financial liability. The rise of neoliberalism and the monetization of our lives are not unique to the United States, and it is likely that such capitalist values are also fueling ageism in other parts of the world. Watching this growing disdain for senior citizens is especially painful for those of us who grew up inspired by our elders and who have respected their intellect and the guidance they can offer.


In her book, Gullette also explores the cultural dimension of the cognitive decline that occurs with aging and how it contributes to ageism. As our minds age, most of us will experience some degree of cognitive decline, such as memory loss or a deceleration in our ability to learn and process information. In certain disease states such as Alzheimer's dementia or vascular dementia (usually due to strokes or ‘mini-strokes’), the degree of cognitive impairment can be quite severe. However, as Gullette points out, the dichotomy between dementia and non-dementia is often an oversimplification. Cognitive impairment with aging represents a broad continuum. Not every form of dementia is severe and not every cognitive impairment – whether or not it is directly associated with a diagnosis of dementia – is global. Episodic memory loss in an aging person does not necessarily mean that the person has lost his or her ability to play a musical instrument or write a poem. However, in a climate of ageism, labels such as "dementia" or "cognitive impairment" are sometimes used as a convenient excuse to marginalize and ignore aged fellow humans.

Perhaps I am simply getting older or maybe some of my academic colleagues have placed me on the marketing lists of cognitive impairment snake oil salesmen. My junk mail folder used to be full of emails promising hours of sexual pleasure if I purchased herbal Viagra equivalents. However, in the past months I have received a number of junk emails trying to sell nutritional supplements which can supposedly boost my memory and cognitive skills and restore the intellectual vigor of my youth. As much as I would like to strengthen my cognitive skills by popping a few pills, there is no scientific evidence that supports the efficacy of such treatments. A recent article by Naqvi and colleagues, which reviewed randomized controlled trials – the ‘gold standard’ for testing the efficacy of medical treatments – did not find any definitive scientific evidence that vitamin supplements or herbs such as Ginkgo can improve cognitive function in the elderly. The emerging consensus is that, based on the currently available data, two basic interventions are best suited for improving cognitive function or preventing cognitive decline in older adults: regular physical activity and cognitive training.

Cognitive training is a rather broad approach and can range from enrolling older adults in formal education classes to teaching participants exercises that enhance specific cognitive skills, such as short-term memory. One of the key issues with studies that investigate the impact of cognitive training in older adults has been the difficulty of narrowing down what aspect of the training is actually beneficial. Is it merely being enrolled in a structured activity, or is it the challenging nature of the program which improves cognitive skills? Does it matter what type of education the participants are receiving? The lack of appropriate control groups in some studies has made it difficult to interpret the results.

The recent study "The Impact of Sustained Engagement on Cognitive Function in Older Adults: The Synapse Project" published in the journal Psychological Science by the psychology researcher Denise Park and her colleagues at the University of Texas at Dallas is an example of an extremely well-designed study which attempts to tease out the benefits of participating in a structured activity versus receiving formal education and acquiring new skills. The researchers assigned subjects with a mean age of 72 years (259 participants were enrolled, but only 221 subjects completed the whole study) to participate in 14-week program in one of five intervention groups: 1) learning digital photography, 2) learning how to make quilts, 3) learning both digital photography and quilting (half of the time spent in each program), 4) a "social condition" in which the members participated in a social club involving activities such as cooking, playing games, watching movies, reminiscing, going on regular field trips but without the acquisition of any specific new skills or 5) a "placebo condition" in which participants were provided with documentaries, informative magazines, word games and puzzles, classical-music CDs and asked to perform and log at least 15 hours a week of such activities. None of the participants carried a diagnosis of dementia and they were novices to the areas of digital photography or quilting. Upon subsequent review of the activities in each of the five intervention groups, it turned out that each group spent an average of about 16-18 hours per week in the aforementioned activities, without any significant difference between the groups. Lastly, a sixth group of participants was not enrolled in any specific program but merely asked to keep a log of their activities and used as a no-intervention control.

When the researchers assessed the cognitive skills of the participants after the 14-week period, the type of activity the participants had been enrolled in had a significant impact on their cognition. For example, the participants in the photography class showed a much greater degree of improvement in their episodic memory and their visuospatial processing than those in the placebo condition. On the other hand, the cognitive processing speed of the participants increased most in the dual-condition group (photography and quilting) as well as in the social condition. The general trend was that the groups which placed the highest cognitive demands on the participants and also challenged them to be creative (acquiring digital photography skills, learning to make quilts) showed the greatest improvements.

However, the study has some key limitations. Since only 221 participants were divided across six groups, each individual group was fairly small. Repeating the study with a larger sample would increase its statistical power and provide more definitive results. Furthermore, the cognitive assessments were performed soon after completion of the 14-week programs. Would the photography group show sustained memory benefits even a year after completion of the program? Would the participants continue to be engaged in digital photography long after completion of the respective courses?
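To get a feel for why the small group sizes matter, here is a rough back-of-the-envelope power calculation of my own (a sketch, not an analysis from the paper), assuming roughly 37 participants per group and a conventional two-sample comparison between two of the groups:

```python
# Back-of-the-envelope power calculation (my own sketch, not from the paper):
# with ~221 participants split across six groups (~37 per group), how large
# would a difference between two groups have to be for a standard two-sample
# t-test to detect it reliably?
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Solve for the minimum detectable effect size (Cohen's d) at 80% power
# and a 5% two-sided significance level, with 37 participants per group.
d = analysis.solve_power(nobs1=37, alpha=0.05, power=0.8, ratio=1.0)

print(f"Minimum detectable effect size: d = {d:.2f}")  # roughly 0.66
```

In other words, groups of this size can only reliably detect medium-to-large differences between two conditions; subtler benefits of one activity over another would require a larger sample, which is exactly why replication with more participants would be valuable.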

Despite these limitations, the study carries an important take-home message: cognitive skills in older adults can indeed be improved, especially if they are exposed to unfamiliar terrain and asked to actively acquire new skills. Merely watching educational documentaries or completing puzzles (the "placebo condition") is not enough. This research will likely spark many future studies that help define the specific mechanisms by which acquiring new skills leads to improved memory function, as well as studies that individualize cognitive training. Some older adults may benefit most from learning digital photography, while others might benefit from acquiring science skills or participating in creative writing workshops. This research also gives us hope that we can break the vicious cycle of ageism, in which older citizens are marginalized because of cognitive decline while this marginalization itself further accelerates their decline. By providing opportunities to channel their creativity, we can improve their cognitive function and ensure that they remain engaged in the community.

There are many examples of people who have defied the odds and broken the glass ceiling of ageism. I felt a special sense of pride when I saw my uncle Jamil's name on the 2011 Man Asian Literary Prize shortlist for his book The Wandering Falcon: he was nominated for a ‘debut’ novel at the age of 78. It is true that the interconnected tales of "The Wandering Falcon" were inspired by his work and life in the tribal areas of the Pakistan-Afghanistan borderlands when he was starting out as a young civil servant, and that he completed the first manuscript drafts of these stories in the 1970s. But these stories remained unpublished, squirreled away and biding their time until they would eventually be published nearly four decades later. They would have withered away in this cocooned state if it hadn't been for his younger brother Javed, who prodded the long-retired Jamil, convincing him to dig up, rework and submit those fascinating tales for publication. Fortunately, my uncle found a literary agent and publisher who were not deterred by his advanced age and recognized the immense value of his writing.

When we help older adults tap into their creative potential, we can engender a new culture of respect for the creativity and intellect of our elders.

Further Reading:
  1. Gullette, Margaret Morganroth. Agewise: Fighting the New Ageism in America. University of Chicago Press, 2011.
  2. Naqvi, Raza, et al. "Preventing cognitive decline in healthy older adults." CMAJ July 9, 2013, 185:881-885. doi:10.1503/cmaj.121448
  3. Park, Denise C., et al. "The Impact of Sustained Engagement on Cognitive Function in Older Adults." Psychological Science, published online Nov 8, 2013. doi:10.1177/0956797613499592
Note: An earlier version of this article was first published on 3quarksdaily.com.


Park DC, Lodi-Smith J, Drew L, Haber S, Hebrank A, Bischof GN, & Aamodt W (2014). The impact of sustained engagement on cognitive function in older adults: the synapse project. Psychological Science, 25 (1), 103-12 PMID: 24214244

Wednesday, February 12, 2014

Three Seconds: Poems, Cubes and the Brain

A child drops a chocolate chip cookie on the floor, immediately picks it up, looks quizzically at a parental eye-witness and proceeds to munch on it after receiving an approving nod. This is one version of the "three second rule", which suggests that food can be safely consumed if it has had less than three seconds of contact with the floor. There is really no scientific basis for this legend, because noxious chemicals or microbial flora do not bide their time, counting "One one thousand, two one thousand, three one thousand, …" before they latch on to a chocolate chip cookie. Food will likely accumulate more bacteria the longer it is in contact with the floor, but I am not aware of any rigorous scientific study that has measured the impact of food-floor contact on a second-by-second basis and identified three seconds as a critical temporal threshold. Basketball connoisseurs occasionally argue about a very different version of the "three second rule", and the Urban Dictionary provides us with yet another set of definitions for the "three second rule", such as the time after which one loses a vacated seat in a public setting. I was not aware of any of these "three second rule" versions until I moved to the USA, but I had come across the elusive "three seconds" time interval in a rather different context when I worked at the Institute of Medical Psychology in Munich: stimuli or signals that occur within an interval of up to three seconds are processed and integrated by our brain into a "subjective present".

I joined the Institute of Medical Psychology at the University of Munich as a research student in 1992 primarily because of my mentor Till Roenneberg. His intellect, charm and infectious enthusiasm were simply irresistible. I scrapped all my plans to work on HIV, cancer or cardiovascular disease and instead began researching the internal clock of marine algae in Till's laboratory – in an Institute of Medical Psychology. Within weeks of working at the institute, I realized how fortunate I was. Ernst Pöppel, one of Germany's leading neuroscientists and the director of the institute, had created a multidisciplinary research heaven. Ernst assembled a team of remarkably diverse researchers who studied neurobiology, psychology, linguistics, mathematics, philosophy, endocrinology, cell physiology, marine biology, computer science, ecology – all on the same floor. Since I left the institute nearly 20 years ago, I have worked in many academic departments at various institutions, each claiming to value multidisciplinary studies, but I have never again encountered any place that has been able to successfully integrate natural sciences, social sciences and the humanities in the same way as the Munich institute.

The central, unifying theme of the institute was time. Not physical time, but biological and psychological time. How does our brain perceive physical time? What is the structure of perceived time? What regulates biological oscillations in humans, animals and even algae? Can environmental cues modify temporal perception? The close proximity of so many disciplines made for fascinating coffee-break discussions, forcing us to re-evaluate our own research findings in the light of discoveries made in neighboring labs and inspiring us to become more creative in our experimental design.

Some of the most interesting discussions I remember revolved around the concept of the subjective present, i.e. the question of what it is that we perceive as the "now". Our brain continuously receives input from our senses, such as images we see, sounds we hear or sensations of touch. For our brain to process these stimuli appropriately, it creates a temporal structure so that it can tell preceding stimuli apart from subsequent ones. But the brain not only assigns a temporal order to the stimuli; it also integrates them and conveys to us a sense of the subjective past and the subjective present. We often use vague phrases such as "living in the moment", and we all have a sense of what the "now" is, but we do not always realize what time intervals we are referring to. If we saw an image or heard a musical note just one second ago, physical time would clearly place it in "the past". Decades of research performed by Ernst Pöppel and his colleagues at the institute, as well as by several other laboratories around the world, suggest that our brain integrates our subjective temporal reality in chunks of approximately three seconds' duration.

Temporal order can be assessed in a rather straightforward experimental manner. Research subjects are presented with sequential auditory clicks, one to each ear. If the clicks are one second apart, nearly all participants can correctly identify whether the click in the right ear came before the one in the left ear. It turns out that this holds true even if the clicks are only 100 milliseconds (0.1 seconds) apart. The threshold for being able to correctly assign a temporal order to such brief stimuli lies around 30 milliseconds for young adults (up to 25 years old) and 60 milliseconds for older adults.
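For readers who like to tinker, here is a minimal simulation sketch in Python of how such a threshold might be read off from click-order judgments. This is not the code used in the actual experiments; it assumes, purely for illustration, a listener whose accuracy follows a logistic psychometric function of the inter-click gap, and it reports the smallest tested gap at which that listener is at least 75% correct (the function p_correct and all parameter values are my own assumptions).

```python
import numpy as np

rng = np.random.default_rng(42)

def p_correct(gap_ms, threshold_ms, slope=0.15):
    # Logistic psychometric function rising from chance (0.5) toward 1.0;
    # at gap_ms == threshold_ms the simulated listener is 75% correct.
    return 0.5 + 0.5 / (1.0 + np.exp(-slope * (gap_ms - threshold_ms)))

def estimate_threshold(true_threshold_ms, gaps_ms, trials_per_gap=200):
    # Return the smallest tested gap at which the simulated listener
    # reports the correct click order on at least 75% of trials.
    for gap in gaps_ms:
        correct = rng.random(trials_per_gap) < p_correct(gap, true_threshold_ms)
        if correct.mean() >= 0.75:
            return gap
    return None

gaps = np.arange(10, 110, 10)            # inter-click gaps in milliseconds
young = estimate_threshold(30.0, gaps)   # ~30 ms threshold reported for young adults
older = estimate_threshold(60.0, gaps)   # ~60 ms threshold reported for older adults
print(f"estimated thresholds: young ~{young} ms, older ~{older} ms")
```

In a real experiment one would fit the psychometric function to the responses rather than thresholding raw proportions, but the sketch captures the core idea: below a certain gap, order judgments collapse toward chance.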

Temporal integration of stimuli, on the other hand, cannot be directly measured through experiments. It is not possible to ask research subjects "Are these two stimuli part of your now?" and expect a definitive answer, because everyone has a different concept and definition of what constitutes "now". Therefore, researchers such as Ernst Pöppel have had to resort to indirect assessments of temporal integration to ascertain what interval of time is grasped as a perceptual unit by our brain. An excellent summary of this work can be found in the paper "A hierarchical model of temporal perception". Instead of reviewing the hundreds of experiments that have led researchers to derive the three-second interval, I will review just two studies which I believe are among the most interesting.

In one of the studies, Pöppel partnered with the American poet Frederick Turner. Turner and Pöppel recorded and measured hundreds of Latin, Greek, English, Chinese, Japanese, French and German poems, analyzing the length of each LINE. They used the expression LINE to describe a "fundamental unit of metered poetry". In many cases, a standard verse or line in a poem did indeed fit the Turner-Pöppel definition of a LINE, but they used the more generic LINE for their analysis because not all languages or orthographic traditions write or print a LINE in a separate space, as is common in English or German poems. If a long line in a poem was divided by a caesura into two sections, Turner and Pöppel considered this to be two LINES.

The basic idea behind this analysis was that each unit of a poem (LINE) conveys one integrated idea or thought, and that the reader experiences each LINE as a "now" moment while reading the poem. Turner and Pöppel published their results in the classic essay "The Neural Lyre: Poetic Meter, the Brain, and Time", for which they received the Levinson Prize in 1983. Their findings were quite remarkable. The peak duration of LINES in poems was between 2.5 and 3.5 seconds, independent of the language the poems were written in. For example, 73% of German poems had a LINE duration between 2 and 3 seconds. Here are some of their other specific findings:
Japanese
  Epic meter (a seven-syllable line followed by a five-syllable one), average: 3.25 secs.
  Waka, average: 2.75 secs.
  Tanka (recited much faster than the epic, as 3 LINES of 5, 12, and 14 syllables), average: 2.70 secs.
Chinese
  Four-syllable line: 2.20 secs.
  Five-syllable line: 3.00 secs.
  Seven-syllable line: 3.80 secs.
English
  Pentameter: 3.30 secs.
  Seven-syllable trochaic line: 2.50 secs.
  Stanzas using different line lengths: 3.00 secs., 3.10 secs.
  Ballad meter (octosyllabic): 2.40 secs.
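As a quick back-of-the-envelope check, we can tabulate the durations reported above and see how tightly they cluster around three seconds. The following short Python sketch does just that; the dictionary labels are my own shorthand, and the mixed-stanza entry averages the two reported values.

```python
# LINE durations in seconds, taken from the Turner-Poeppel figures above;
# the labels are shorthand, not their original terminology.
line_durations = {
    "Japanese epic meter": 3.25,
    "Japanese waka": 2.75,
    "Japanese tanka": 2.70,
    "Chinese four-syllable line": 2.20,
    "Chinese five-syllable line": 3.00,
    "Chinese seven-syllable line": 3.80,
    "English pentameter": 3.30,
    "English seven-syllable trochaic line": 2.50,
    "English mixed-length stanzas": 3.05,  # average of the reported 3.00 and 3.10
    "English ballad meter": 2.40,
}

values = list(line_durations.values())
mean_duration = sum(values) / len(values)
in_window = sum(1 for v in values if 2.5 <= v <= 3.5)
print(f"mean LINE duration: {mean_duration:.2f} s")
print(f"{in_window} of {len(values)} meters fall within the 2.5-3.5 s window")
```

Seven of the ten meters fall within the 2.5-3.5 second window, and the mean LINE duration comes out just under three seconds.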
Poets all around the world did not conspire to write three-second LINES. It is more likely that our brain is attuned to processing poetic information in three-second chunks and that poets are subconsciously aware of this. This was not a controlled, rigorous scientific study, but the results are nevertheless fascinating, not only because they point towards the three-second interval that neuroscientists have established in recent decades for temporal integration in the brain, but also because they suggest that the rules for metered poetry may be universal. I strongly advise everyone to read the now-classic essay by Turner and Pöppel, and then to read their own favorite poems aloud and see whether the LINES indeed approximate three seconds.


A second way to glean insight into the inner workings of the temporal integration process in our brain is the use of perceptual reversal experiments, such as those performed with the Necker cube. This cube is a 2-D line drawing which our brain perceives as a cube – or actually as two distinct cubes. Most people who stare at the drawing for a while will notice that their mind creates two distinct cube representations. Once the mind perceives the two different cubes, it becomes very difficult to cling to just one of them. Our brain starts flip-flopping between the two cubes, even when we try our best to hang on to just one of the cube representations in our mind. Interestingly, the average duration it takes for our mind to automatically shift from one cube representation to the other approximates three seconds.
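In such reversal experiments, observers typically press a key each time the percept flips, and the dwell time is simply the interval between successive key presses. A minimal sketch of that bookkeeping, using entirely made-up timestamps, might look like this:

```python
import statistics

def mean_reversal_interval(reversal_times_s):
    # Dwell times are the intervals between successive key presses,
    # each press marking a perceptual flip of the Necker cube.
    intervals = [b - a for a, b in zip(reversal_times_s, reversal_times_s[1:])]
    return statistics.mean(intervals)

# Hypothetical timestamps (in seconds) from a single viewing session:
timestamps = [0.0, 3.1, 6.0, 9.4, 12.2, 15.3]
print(f"mean dwell time: {mean_reversal_interval(timestamps):.2f} s")
```

With these hypothetical timestamps, the mean dwell time comes out close to three seconds, mirroring the reported average.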

Nicole von Steinbüchel, a colleague of Ernst Pöppel at the Institute of Medical Psychology, asked a fascinating question: if the oscillatory perceptual shift between the two cube representations is indeed indicative of the "subjective present" and of our temporal integration capacity, would brain injury affect the oscillation? She studied patients who had brain lesions (usually due to a stroke) in either the left or the right hemisphere of the brain. She and her team were able to show that while healthy participants reported a three-second interval between the automatic shifts of the cube representations, the average shift time was four seconds in patients with damage in the left hemisphere and up to six seconds if the damage had occurred in a certain part of the right hemisphere. Nicole von Steinbüchel's research demonstrates the clinical relevance of studying temporal integration, but it also suggests that the brain may have designated areas which specialize in creating a temporal structure.

The analysis of poetry and the Necker cube experiments are just two examples of cognitive studies indicating that our brain uses three-second intervals to process information and generate the experience of the "now", or the "subjective present". Taken alone, none of these studies is conclusive proof that our brain works in three-second intervals, but one cannot help noticing a remarkable convergence of data pointing towards a cognitive three-second rule.

References:
Frederick Turner and Ernst Pöppel (1983) "The Neural Lyre: Poetic Meter, the Brain, and Time." Poetry 142(5): 277-309.

Ernst Pöppel (1997) "A hierarchical model of temporal perception." Trends in Cognitive Sciences 1(2): 56-61.

Nicole von Steinbüchel (1998) "Temporal ranges of central nervous processing: clinical evidence." Experimental Brain Research 123(1-2): 220-233.

Note: An earlier version of this article was first published on 3quarksdaily.com.


von Steinbüchel N (1998). Temporal ranges of central nervous processing: clinical evidence. Experimental Brain Research, 123 (1-2), 220-33 PMID: 9835412

'Islamophobia' is Not a Phobia

The expression "Islamophobia" is becoming increasingly common as a description of anti-Muslim hostility in Europe and North America. Even though its use is still not quite as trendy as a Lady Gaga song, one can already find numerous books, articles and websites that expound on the phenomenon of "Islamophobia". The views on "Islamophobia" are quite diverse: they range from people at one end of the spectrum who do not believe that there is any significant anti-Muslim hostility in Europe and North America and therefore see no need for the term, to those at the other end who frequently invoke "Islamophobia" to describe perceived hostility towards Muslims or Islam.

During the last few months, at least three books on the topic of "Islamophobia" have been published by academics and scholars: Islamophobia: The Ideological Campaign Against Muslims by Stephen Sheehi, Islamophobia by Chris Allen, and Islamophobia: The Challenge of Pluralism in the 21st Century edited by John L. Esposito and Ibrahim Kalin. These books discuss a variety of topics related to the phenomenon of "Islamophobia", such as its various manifestations in politics and the mass media, its historical roots and development, its overlap with racism, and its relationship to colonialism and imperialism. The actual definition of "Islamophobia" is not discussed in much detail. Most of these books use the term to describe various types of fear, prejudice, discrimination or hostility directed against Islam and Muslims, in part basing this vague definition on the 1997 report of the Runnymede Trust in the U.K., one of the first documents to use the actual expression "Islamophobia". Chris Allen's book devotes a whole chapter to developing a new definition of "Islamophobia", but even the proposed definition remains understandably vague, since the numerous manifestations of prejudice cannot be easily captured in a short definition.

While so many examples of what constitutes "Islamophobia" are presented, little effort is devoted to clarifying what does not constitute "Islamophobia". As usage of the expression increases, the danger of a vague and broad definition becomes apparent. Without a reasonable effort to delineate what is and what is not "Islamophobia", the term could easily be used to stigmatize or suppress legitimate criticisms of Muslim society, culture or theology. Not every rejected mosque building permit is necessarily a form of anti-Muslim discrimination, and not every criticism of Muslim society, culture or religion is necessarily a manifestation of an "Islamophobic agenda". Academic scholars who use the expression "Islamophobia" are likely aware of the need to use the term in a narrow sense, so that it refers to true prejudice or hostility towards Muslims and is not abused to suppress legitimate critical views of Muslim society, culture or theology. However, for the expression "Islamophobia" (or any other expression that describes anti-Muslim prejudice and hostility) to be used in a meaningful manner by the wider public, there is a need to clearly formulate what does not constitute "Islamophobia".

In addition to the vagueness of the "Islamophobia" expression, another troubling aspect of this neologism is that it invokes the psychiatric concept of a "phobia". Phobias fall under the category of anxiety disorders and describe pathological fears; while many know the term from the infamous expression "arachnophobia" (a pathological fear of spiders), many different types of phobias have been observed in patients. The Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR), the standard manual of the American Psychiatric Association, defines a "Specific Phobia" as a
"Marked and persistent fear that is excessive or unreasonable, cued by the presence or anticipation of a specific object or situation (e.g., flying, heights, animals, receiving an injection, seeing blood)."
There are additional criteria that characterize a phobia, but I find the following one extremely interesting: "The person recognizes that the fear is excessive or unreasonable."

This is quite important, since not every fear is automatically a "phobia"; the psychiatric term is reserved for cases in which the fear is excessive or unreasonable. If the patient does not recognize the fear as excessive or unreasonable, it becomes very difficult to actually prove that the fear is indeed excessive and unreasonable, and thus the term "phobia" is not applicable. When neologisms containing the word "phobia" are coined, this requirement should be kept in mind. I myself use the term "porkophobia" to describe my own ridiculous and unreasonable dislike of pig products, which by far exceeds any religious proscriptions. My understanding is that most people who are accused of "Islamophobia" do not really think that their fears are excessive and unreasonable. Therefore, anti-Muslim fears, hostility or prejudice do not really constitute a "phobia" in the psychiatric sense, and the use of the neologism "Islamophobia" may need to be re-evaluated.

Lastly, I have encountered multiple Muslims who have likened present-day "Islamophobia" in Europe and North America to Anti-Semitism. My obvious response is that this is an absurd comparison, since European Anti-Semitism resulted in the murder of millions of Jews in concentration camps and death camps during the Holocaust, while no such camps were ever built to murder Muslims. The counter-response I have then received is that present-day "Islamophobia" may be similar to pre-Holocaust Anti-Semitism in Europe. However, even this comparison contains a number of key flaws. A thoughtful analysis can be found in Matti Bunzl's book "Anti-Semitism and Islamophobia: Hatreds Old and New in Europe", which points out that there are some parallels between "Islamophobia" and Anti-Semitism in Europe, such as the fact that far-right political parties have traditionally used Anti-Semitism as a means of creating a unified base of voters but are now replacing Anti-Semitism with anti-Muslim hostility to achieve the same goal. However, Bunzl goes on to show how different the roots and development of Anti-Semitism and "Islamophobia" are. Importantly, he suggests that Anti-Semitism developed as a form of racial hatred of Jews in the 19th century, but that it had been preceded by centuries of Jewish persecution by Christians on religious grounds. Present-day "Islamophobia", Bunzl proposes, is neither a true theological hostility nor a racial hatred like Anti-Semitism, but instead represents a perceived clash of civilizations.

In conclusion, the increasingly common usage of the expression "Islamophobia" requires a thoughtful and clear definition of what does and does not constitute anti-Muslim prejudice and hostility, a re-evaluation of whether "Islamophobia" is truly the most appropriate term to describe such perceived anti-Muslim prejudice and hostility, and an avoidance of inappropriate blanket comparisons between anti-Muslim hostility and Anti-Semitism.

Note: An earlier version of this article was first published in the Huffington Post.