
Sunday, March 9, 2014

There Was No Couch: On Mental Illness and Creativity


The psychiatrist held the door open for me and my first thought as I entered the room was “Where is the couch?” Instead of the expected leather couch, I saw a patient lying on a flat operating table surrounded by monitors, devices, electrodes, and a team of physicians and nurses. The psychiatrist had asked me if I wanted to join him during an “ECT” for a patient with severe depression. It was the first day of my psychiatry rotation at the VA (Veterans Affairs Medical Center) in San Diego, and as a German medical student I was not yet used to the acronymophilia of American physicians. I nodded without admitting that I had no clue what “ECT” stood for, hoping that it would become apparent once I sat down with the psychiatrist and the depressed patient.


I had big expectations for this clinical rotation. German medical schools allow students to perform their clinical rotations during their final year at academic medical centers overseas, and I had been fortunate enough to arrange for a psychiatry rotation in San Diego. The University of California, San Diego (UCSD) and the VA in San Diego were known for their excellent psychiatry program, and there was the added bonus of living in San Diego. Prior to this rotation in 1995, most of my exposure to psychiatry had taken the form of medical school lectures, theoretical textbook knowledge and rather limited contact with actual psychiatric patients. This may have been part of the reason why I had a rather naïve and romanticized view of psychiatry. I thought that the mental anguish of psychiatric patients would foster their creativity and that they were somehow plunging from one existentialist crisis into another. I was hoping to engage in witty repartee with these creative patients and to learn from their philosophical insights about the actual meaning of life. I imagined that interactions with psychiatric patients would be similar to those I had seen in Woody Allen’s movies: a neurotic but intelligent artist or author would be sitting on a leather couch and sharing his dreams and anxieties with his psychiatrist.

I quietly stood in a corner of the ECT room, eavesdropping on the conversations between the psychiatrist, the patient and the other physicians in the room. I gradually began to understand that “ECT” stood for “Electroconvulsive Therapy”. The patient had severe depression and had failed to respond to multiple antidepressant medications. He would now receive ECT, commonly known as electroshock therapy, a measure that was reserved for only very severe cases of refractory mental illness. After the patient was sedated, the psychiatrist initiated the electrical charge that induced a small seizure in the patient. I watched the patient's arms and legs jerk and shake. Instead of participating in a Woody-Allen-style discussion with a patient, I had ended up in a scene reminiscent of “One Flew Over the Cuckoo's Nest”, a silent witness to a method that I thought was both antiquated and barbaric. The ECT procedure did not take very long, and we left the room to let the sedation wear off and give the patient some time to rest and recover. As I walked away from the room, I realized that my ridiculously glamorized image of mental illness was already beginning to fall apart on the first day of my rotation.

During the subsequent weeks, I received an eye-opening crash course in psychiatry. I became acquainted with DSM-IV, the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders, which was the sacred scripture of American psychiatry according to which mental illnesses were diagnosed and classified. I learned that ECT was reserved for the most severe cases, and that a typical patient was usually prescribed medications such as anti-psychotics, mood stabilizers or anti-depressants. I was surprised to see that psychoanalysis had gone out of fashion. Depictions of the USA in German popular culture and Hollywood movies had led me to believe that many, if not most, Americans had their own personal psychoanalysts. My psychiatry rotation at the VA took place in the mid-1990s, the boom time for psychoactive medications such as Prozac and the concomitant demise of psychoanalysis.


I found it exceedingly difficult to work with the DSM-IV and to appropriately diagnose patients. The two biggest obstacles I encountered were a) determining cause-effect relationships in mental illness and b) distinguishing between regular human emotions and true mental illness. The DSM-IV criteria for diagnosing a “Major Depressive Episode” included depressive symptoms such as sadness or guilt which were severe enough to “cause clinically significant distress or impairment in social, occupational, or other important areas of functioning”. I had seen a number of patients who were very sad and had lost their job, but I could not determine whether the sadness had impaired their “occupational functioning” or whether they had first lost their job and this had in turn caused profound sadness. Any determination of causality was based on the self-report of patients, and their memories of event sequences were highly subjective.

The distinction between “regular” human emotions and mental illness was another challenge for me, and the criteria in the DSM-IV manual seemed so broad that what I would have considered “sadness” was now being labeled as a Major Depression. A number of patients that I saw had severe mental illnesses such as depression, a condition so disabling that they could hardly eat, sleep or work. The patient who had undergone ECT on my first day belonged to that category. However, the majority of patients exhibited only some impairment in their sleep or eating patterns and experienced a degree of sadness or anxiety that I had seen in myself or my friends. I had considered transient episodes of anxiety or unhappiness as part of the spectrum of human emotional experience. The problem I saw with the patients in my psychiatry rotation was that they were not only being labeled with a diagnosis such as “Major Depression”, but were then prescribed antidepressant medications without any clear plan to ever take them off the medications. By coincidence, that year I met the forensic psychiatrist Ansar Haroun, who was also on the faculty at UCSD and was able to help me with my concerns. Due to his extensive work in the court system and his rigorous analysis of mental states for legal proceedings, Haroun was an expert on causality in psychiatry as well as on the definition of what constitutes a truly pathological mental state.

Regarding the issue of causality, Haroun explained to me that the complexity of the mind and of mental states makes it extremely difficult to clearly define cause and effect relationships in psychiatry. In infectious diseases, for example, specific bacteria can be identified by laboratory tests as causes of a fever. The fever normally does not precede the bacterial infection, nor does it cause the bacterial infection. The diagnosis of mental illnesses, on the other hand, rests on subjective assessments of patients and is further complicated by the fact that there are no clearly defined biological causes or even objective markers for most mental illnesses. Psychiatric diagnoses are therefore often based on patterns of symptoms and a presumed causality. If a patient exhibits symptoms of a depressed mood and has also lost his or her job during that same time period, psychiatrists have to determine whether the depression was the cause of losing the job or whether the job loss caused the depressive symptoms. In my limited experience with psychiatry and the many discussions I have had with practicing psychiatrists, it appears that the leeway given to psychiatrists to assess cause-effect relationships may result in an over-diagnosis of mental illnesses or an over-estimation of their impact.

I also learned from Haroun that the question of how to distinguish the spectrum of “regular” human emotions from actual mental illness had resulted in a very active debate in the field of psychiatry. Haroun directed me towards the writings of Thomas Szasz, a brilliant psychiatrist but also a critic of psychiatry, who repeatedly pointed out the limited scientific evidence for diagnoses of mental illness. Szasz's book “The Myth of Mental Illness” was first published in 1961 and challenged the foundations of modern psychiatry. One of his core criticisms of psychiatry was that his colleagues had begun to over-diagnose mental illnesses by blurring the boundaries between everyday emotions and true diseases. Every dis-ease (discomfort) was being turned into a disease that required a therapy. The reasons for this overreach by psychiatry were manifold, ranging from society and the state trying to regulate what counted as acceptable or normal behavior to psychiatrists and pharmaceutical companies that would benefit financially from the over-diagnosis of mental illness. An excellent overview of his essays can be found in his book “The Medicalization of Everyday Life”. Szasz passed away in 2012, but psychiatrists and researchers are now increasingly voicing similar concerns about the direction that modern psychiatry has taken. Allan Horwitz and Jerome Wakefield, for example, have recently published “The Loss of Sadness: How Psychiatry Transformed Normal Sorrow into Depressive Disorder” and “All We Have to Fear: Psychiatry's Transformation of Natural Anxieties into Mental Disorders”. Unlike Szasz, who went so far as to deny the existence of mental illness, Horwitz and Wakefield have taken a more nuanced approach. They accept the existence of true mental illnesses, admit that these illnesses can be disabling and acknowledge that patients who are afflicted by mental illnesses do require psychiatric treatment. However, Horwitz and Wakefield criticize the massive over-diagnosis of mental illness and point out the need to distinguish true mental illnesses from normal sadness and anxiety.


Before I started my psychiatry rotation in San Diego, I had been convinced that mental illness fostered creativity. I had never really studied the question in much detail, but there were constant references in popular culture, movies, books and TV shows to the creative minds of patients with mental illness. The supposed link between mental illness and creativity was so ingrained in my mind that the word “psychotic” automatically evoked images of van Gogh's paintings and other geniuses whose creative minds were fueled by the bizarreness of their thoughts. Once I began seeing psychiatric patients who truly suffered from severe, disabling mental illnesses, it became very difficult for me to maintain this romanticized view of mental illness. People who truly suffered from severe depression had difficulties even getting out of bed, getting dressed and meeting their basic needs. It was difficult to envision someone suffering from such a disabling condition being able to write large volumes of poetry or to analyze the data from ground-breaking experiments. The brilliant book “Creativity and Madness: New Findings and Old Stereotypes” by Albert Rothenberg helped me understand that the supposed link between creativity and mental illness was primarily based on myths, anecdotes and a selection bias in which the creative accomplishments of patients with mental illness were glorified and attributed to the illness itself. Geniuses who suffered from schizophrenia or depression were not creative because of their mental illness but in spite of their mental illness.

I began to realize that the over-diagnosis of mental illness and the departure from causality that had become characteristic of contemporary psychiatry also helped foster the myth that mental illness enhances creativity. Many beautiful pieces of literature or art can be inspired by emotional states such as the sadness of unrequited love or the death of a loved one. Creativity is often a response to a state of discomfort or dis-ease, an attempt to seek out comfort. However, if definitions of mental illness are broadened to the extent that nearly every such dis-ease is considered a disease, one can easily fall into the trap of believing that mental illness indeed begets creativity. With respect to establishing causality, Rothenberg found that, contrary to the prevailing myth, mental illness was actually a disabling condition that prevented creative minds from completing their artistic or scientific tasks. A few years ago, I came across “Poets on Prozac: Mental Illness, Treatment, and the Creative Process”, a collection of essays written by poets who suffer from mental illness. The personal accounts of most of these poets suggest that their mental illnesses did not help them write their poetry, but actually acted as major hindrances. It was only when their illness was adequately treated and they were in a state of remission that they were able to write poems. A recent comprehensive analysis of studies that attempt to link creativity and mental illness can be found in the excellent textbook “Explaining Creativity: The Science of Human Innovation” by Keith Sawyer, who concludes that there is no scientific evidence for the claim that mental illness promotes creativity. He also points to a possible origin of this myth:
The mental illness myth is based in cultural conceptions of creativity that date from the Romantic era, as a pure expression of inner inspiration, an isolated genius, unconstrained by reason and convention.
I assumed that the myth had finally been laid to rest, but, to my surprise, I came across the headline Creativity 'closely entwined with mental illness' on the BBC website in October 2012. The BBC story was referring to the large-scale Swedish study “Mental illness, suicide and creativity: 40-Year prospective total population study” by Simon Kyaga and his colleagues at the Karolinska Institute, published online in the Journal of Psychiatric Research. The BBC news report stated “Creativity is often part of a mental illness, with writers particularly susceptible, according to a study of more than a million people” and continued:
Lead researcher Dr Simon Kyaga said the findings suggested disorders should be viewed in a new light and that certain traits might be beneficial or desirable.
For example, the restrictive and intense interests of someone with autism and the manic drive of a person with bipolar disorder might provide the necessary focus and determination for genius and creativity.
Similarly, the disordered thoughts associated with schizophrenia might spark the all-important originality element of a masterpiece.
These statements went against nearly all the recent scientific literature on the supposed link between creativity and mental illness and once again rehashed the tired, romanticized myth of the mentally ill genius. I was puzzled by these claims and decided to read the original paper. There was the additional benefit of learning more about the mental health of Swedes, because my wife is a Swedish-American. It never hurts to know more about the mental health or the creative potential of one’s spouse.

Kyaga’s study did not measure creativity itself, but merely assessed correlations between self-reported “creative professions” and the diagnoses of mental illness in the Swedish population. Creative professions included scientific professions (primarily scientists and university faculty members) as well as artistic professions such as visual artists, authors, dancers and musicians. The deeply flawed assumption of the study was that if an individual has a “creative profession”, he or she has a higher likelihood of being a creative person. Accountants were used as a “control”, implying that being an accountant does not involve much creativity. This may hold true for Sweden, but the creativity of accountants in the USA has been demonstrated by the recent plethora of financial scandals. The size of the Kyaga study was quite impressive, involving over one million patients and collecting data on the relatives of patients. The fact that Sweden has a total population of about 9.5 million and that more than one million of its adult citizens are registered in a national database as having at least one mental illness is both remarkable and worrisome.
The main outcome was the likelihood that patients with certain mental illnesses such as depression, schizophrenia or anxiety disorders were engaged in a “creative profession”. The results of the study directly contradicted the BBC hyperbole:
We found no positive association between psychopathology and overall creative professions except for bipolar disorder. Rather, individuals holding creative professions had a significantly reduced likelihood of being diagnosed with schizophrenia, schizoaffective disorder, unipolar depression, anxiety disorders, alcohol abuse, drug abuse, autism, ADHD, or of committing suicide.
Not only did the authors fail to find a positive correlation between creative professions and mental illnesses (with the exception of bipolar disorder), they actually found the opposite of what they had suspected: Patients with mental illnesses were less likely to engage in a creative profession.
These findings should not come as a surprise to anyone who has been following the scientific literature on this topic. After all, the disabling features of mental illness make it very difficult to maintain a creative profession. Kyaga and colleagues also presented a rather contrived subgroup analysis to test whether there was any group within the “creative professions” that showed a positive correlation with mental illness. The analysis appears contrived because they broke down only the artistic professions and did not perform a similar analysis for the scientific professions. Among all these subgroup analyses, the researchers found a positive correlation between the self-reported profession ‘author’ and a number of mental illnesses. However, they also found that the other artistic professions did not show such a positive correlation.

 How the results of this study gave rise to the blatant misinterpretation reported by the BBC that “the disordered thoughts associated with schizophrenia might spark the all-important originality element of a masterpiece” is a mystery in itself. It shows the power of the myth of the mad genius and how myths and convictions can tempt us to misinterpret data in a way that maintains the mythic narrative. The myth may also be an important component in the attempt to medicalize everyday emotions. The notion that mental illness fosters creativity could make the diagnosis more palatable. You may be mentally ill, but don’t worry, because it might inspire you to paint like van Gogh or write poems like Sylvia Plath.

A study of the prevalence of mental illness published in the Archives of General Psychiatry in 2005 estimated that roughly half of all Americans will have been diagnosed with a mental illness by the time they reach the age of 75. This estimate was based on the DSM-IV criteria for mental illness, and the newer DSM-5 manual released in 2013 is likely to further expand the diagnosis of mental illness. The DSM-IV criteria had made an allowance for bereavement, to avoid diagnosing people who were profoundly sad after the loss of a loved one with the mental illness depression. This bereavement exemption has been removed from the new DSM-5 criteria, so that the diagnosis of major depression can be used even during the grieving period. The small group of patients who are truly afflicted with disabling mental illness do not find their suffering to be glamorous. A much larger number of patients are experiencing normal sadness or anxiety and end up being inappropriately diagnosed with mental illness using broad and lax criteria of what constitutes an illness. Are these patients comforted by romanticized myths about mental illness? The continuing over-reach of psychiatry in its attempt to medicalize emotions, supported by a pharmaceutical industry that reaps large profits from this over-reach, should be of great concern to all of society. We need to wade through the fog of pseudoscience and myths to consider the difference between dis-ease and disease and the cost of medicalizing human emotions.



Image Credit: Wikimedia Commons, Public Domain: ECT machine (1960s) by Nasko and Self-Portrait of van Gogh.

An earlier version of this article was first published on the 3Quarksdaily Blog.

ResearchBlogging.org Kyaga, S., Landén, M., Boman, M., Hultman, C., Långström, N., & Lichtenstein, P. (2013). Mental illness, suicide and creativity: 40-Year prospective total population study. Journal of Psychiatric Research, 47(1), 83-90. DOI: 10.1016/j.jpsychires.2012.09.010

Thursday, February 13, 2014

Creativity in Older Adults: Learning Digital Photography Improves Cognitive Function


The unprecedented increase in mean life expectancy during the past centuries and a concomitant drop in the birth rate have resulted in a major demographic shift in most parts of the world. The proportion of fellow humans older than 65 years of age is higher than at any time before in our history. This trend of generalized population ageing will likely continue in developed as well as in developing countries. Population ageing has sadly also given rise to ageism, prejudice against the elderly. In 1950, more than 20% of citizens aged 65 years or older in the developed world participated in the labor force. That percentage has now dropped to below 10%. If the value of a human being is primarily based on their economic productivity – as is so commonly done in societies driven by neoliberal capitalist values – it is easy to see why prejudices against senior citizens are on the rise. They are viewed as non-productive members of society who do not contribute to economic growth and instead represent an economic burden because they consume valuable dollars required to treat the chronic illnesses associated with old age.


In "Agewise: Fighting the New Ageism in America", the scholar and cultural critic Margaret Morganroth Gullette ties the rise of ageism to unfettered capitalism:
There are larger social forces at work that might make everyone, male or female, white or nonwhite, wary of the future. Under American capitalism, with productivity so fetishized, retirement from paid work can move you into the ranks of the "unproductive" who are bleeding society. One vile interpretation of longevity (that more people living longer produces intolerable medical expense) makes the long-lived a national threat, and another (that very long-lived people lack adequate quality of life) is a direct attack on the progress narratives of those who expect to live to a good old age. Self-esteem in later life, the oxygen of selfhood, is likely to be asphyxiated by the spreading hostile rhetoric about the unnecessary and expendable costs of "aging America".
Instead of recognizing the value of the creative potential, wisdom and experience that senior citizens can share with their communities, we are treating them as if they were merely a financial liability. The rise of neoliberalism and the monetization of our lives are not unique to the United States, and it is likely that such capitalist values are also fueling ageism in other parts of the world. Watching this growing disdain for senior citizens is especially painful for those of us who grew up inspired by our elders and who have respected their intellect and the guidance they can offer.


In her book, Gullette also explores the cultural dimension of the cognitive decline that occurs with aging and how it contributes to ageism. As our minds age, most of us will experience some degree of cognitive decline, such as memory loss or a slowing of our ability to learn and process information. In certain disease states such as Alzheimer's dementia or vascular dementia (usually due to strokes or ‘mini-strokes'), the degree of cognitive impairment can be quite severe. However, as Gullette points out, the dichotomy between dementia and non-dementia is often an oversimplification. Cognitive impairment with aging represents a broad continuum. Not every form of dementia is severe, and not every cognitive impairment – whether or not it is directly associated with a diagnosis of dementia – is global. Episodic memory loss in an aging person does not necessarily mean that the person has lost his or her ability to play a musical instrument or write a poem. However, in a climate of ageism, labels such as "dementia" or "cognitive impairment" are sometimes used as a convenient excuse to marginalize and ignore aged fellow humans.

Perhaps I am simply getting older, or maybe some of my academic colleagues have placed me on the marketing lists of cognitive impairment snake oil salesmen. My junk mail folder used to be full of emails promising hours of sexual pleasure if I purchased herbal Viagra equivalents. However, in the past months I have received a number of junk emails trying to sell nutritional supplements which can supposedly boost my memory and cognitive skills and restore the intellectual vigor of my youth. As much as I would like to strengthen my cognitive skills by popping a few pills, there is no scientific data that supports the efficacy of such treatments. A recent article by Naqvi and colleagues, which reviewed randomized controlled trials – the ‘gold standard' for testing the efficacy of medical treatments – did not find any definitive scientific evidence that vitamin supplements or herbs such as Ginkgo can improve cognitive function in the elderly. The emerging consensus is that, based on the currently available data, there are two basic interventions which are best suited for improving cognitive function or preventing cognitive decline in older adults: regular physical activity and cognitive training.

Cognitive training is a rather broad approach and can range from enrolling older adults in formal education classes to teaching participants exercises that enhance specific cognitive skills such as improving short-term memory. One of the key issues with studies which investigate the impact of cognitive training in older adults has been the difficulty of narrowing down what aspect of the training is actually beneficial. Is it merely being enrolled in a structured activity or is it the challenging nature of the program which improves cognitive skills? Does it matter what type of education the participants are receiving? The lack of appropriate control groups in some studies has made it difficult to interpret the results.

The recent study "The Impact of Sustained Engagement on Cognitive Function in Older Adults: The Synapse Project", published in the journal Psychological Science by the psychology researcher Denise Park and her colleagues at the University of Texas at Dallas, is an example of an extremely well-designed study which attempts to tease out the benefits of participating in a structured activity versus receiving formal education and acquiring new skills. The researchers assigned subjects with a mean age of 72 years (259 participants were enrolled, but only 221 subjects completed the whole study) to a 14-week program in one of five intervention groups: 1) learning digital photography, 2) learning how to make quilts, 3) learning both digital photography and quilting (half of the time spent in each program), 4) a "social condition" in which the members participated in a social club involving activities such as cooking, playing games, watching movies, reminiscing and going on regular field trips, but without the acquisition of any specific new skills, or 5) a "placebo condition" in which participants were provided with documentaries, informative magazines, word games and puzzles, and classical-music CDs, and were asked to perform and log at least 15 hours a week of such activities. None of the participants carried a diagnosis of dementia, and all were novices in digital photography and quilting. Upon subsequent review of the activities in each of the five intervention groups, it turned out that each group spent an average of about 16-18 hours per week on the aforementioned activities, without any significant difference between the groups. Lastly, a sixth group of participants was not enrolled in any specific program but merely asked to keep a log of their activities and served as a no-intervention control.

When the researchers assessed the cognitive skills of the participants after the 14-week period, the type of activity the participants had been enrolled in had a significant impact on their cognition. For example, the participants in the photography class showed a much greater improvement in episodic memory and visuospatial processing than those in the placebo condition. Cognitive processing speed, on the other hand, increased most in the dual-condition group (photography and quilting) as well as in the social condition. The general trend was that the groups which placed the highest cognitive demands on the participants and also challenged them to be creative (acquiring digital photography skills, learning to make quilts) showed the greatest improvements.

However, the study has some key limitations. Since only 221 participants were divided across six groups, each individual group was fairly small. Repeating the study with a larger sample would increase its statistical power and provide more definitive results. Furthermore, the cognitive assessments were performed soon after completion of the 14-week programs. Would the photography group show sustained memory benefits even a year after completion of the 14-week program? Would the participants continue to be engaged in digital photography long after completion of the respective courses?

Despite these limitations, the study has an important take-home message: cognitive skills in older adults can indeed be improved, especially if they are exposed to unfamiliar terrain and asked to actively acquire new cognitive skills. Merely watching educational documentaries or completing puzzles (the "placebo condition") is not enough. This research will likely spark future studies that help define the specific mechanisms by which acquiring new skills leads to improved memory function, and perhaps also studies that individualize cognitive training. Some older adults may benefit most from learning digital photography, while others might benefit from acquiring science skills or participating in creative writing workshops. This research also gives us hope that we can break the vicious cycle of ageism, in which older citizens are marginalized because of cognitive decline while this marginalization itself further accelerates their decline. By providing opportunities to channel their creativity, we can improve their cognitive function and ensure that they remain engaged in the community.

There are many examples of people who have defied the odds and broken the glass ceiling of ageism. I felt a special sense of pride when I saw my uncle Jamil's name on the 2011 Man Asian Literary Prize shortlist for his book The Wandering Falcon: he was nominated for a ‘debut' novel at the age of 78. It is true that the inter-connected tales of "The Wandering Falcon" were inspired by his work and life in the tribal areas of the Pakistan-Afghanistan borderlands when he was starting out as a young civil servant, and that he completed the first manuscript drafts of these stories in the 1970s. But these stories remained unpublished, squirreled away and biding their time until they would eventually be published nearly four decades later. They would have withered away in this cocooned state if it hadn't been for his younger brother Javed, who prodded the long-retired Jamil, convincing him to dig up, rework and submit those fascinating tales for publication. Fortunately, my uncle found a literary agent and publisher who were not deterred by his advanced age and recognized the immense value of his writing.

When we help older adults tap into their creative potential, we can engender a new culture of respect for the creativity and intellect of our elders.

Further Reading:
  1. Gullette, Margaret Morganroth. Agewise: Fighting the new ageism in America. University of Chicago Press, 2011.
  2. Naqvi, Raza et al. "Preventing cognitive decline in healthy older adults" CMAJ July 9, 2013; 185:881-885. doi: 10.1503/cmaj.121448
  3. Park, Denise C et al "The Impact of Sustained Engagement on Cognitive Function in Older Adults", published online on Nov 8, 2013 in Psychological Science doi:10.1177/0956797613499592
Note: An earlier version of this article was first published on 3quarksdaily.com.


ResearchBlogging.org Park DC, Lodi-Smith J, Drew L, Haber S, Hebrank A, Bischof GN, & Aamodt W (2014). The impact of sustained engagement on cognitive function in older adults: the synapse project. Psychological science, 25 (1), 103-12 PMID: 24214244

Saturday, February 8, 2014

Curating Creativity

"For every rational line or forthright statement there are leagues of senseless cacophony, verbal nonsense, and incoherency."
                                                Jorge Luis Borges, "Library of Babel"
 
The British-Australian art curator Nick Waterlow was tragically murdered on November 9, 2009 in the Sydney suburb of Randwick. His untimely death shocked the Australian art community, not only because of its gruesome nature – Waterlow was stabbed alongside his daughter by his mentally ill son – but also because it represented a major blow to the burgeoning Australian art scene. He was a highly regarded curator who had served as director of the Sydney Biennale and of international art exhibitions, and he was also an art ambassador who brought together artists and audiences from all over the world.



After his death, his partner Juliet Darling discovered some notes that Waterlow had jotted down shortly before he was killed, characterizing what defines and motivates a good art curator. He had given them the eerily prescient title “A Curator’s Last Will and Testament”:
1. Passion
2. An eye of discernment
3. An empty vessel
4. An ability to be uncertain
5. Belief in the necessity of art and artists
6. A medium— bringing a passionate and informed understanding of works of art to an audience in ways that will stimulate, inspire, question
7. Making possible the altering of perception.
Waterlow’s notes help dismantle the cliché of stuffy old curators who walk around museums ensuring that their collections remain unblemished, and instead portray the curator as a passionate person who is motivated by a desire to inspire artists and audiences alike.

The Evolving Roles of Curators

The traditional role of the curator was closely related to the Latin origins of the word: “curare” means “to take care of”, “to nurse” or “to look after”. Curators of museums or art collections were primarily in charge of preserving, overseeing, archiving and cataloging the artifacts that were placed under their guardianship. As outlined in Thinking Contemporary Curating by Terry Smith, the latter half of the 20th century witnessed the emergence of new roles for art curators, both private curators and those formally employed as curators by museums or art collections. Curators not only organized art exhibitions but were given an increasing degree of freedom in terms of choosing the artists and themes of the exhibitions and creating innovative opportunities for artists to interact with their audiences. The art exhibition itself became a form of art, a collage of art assembled by the curators in a unique manner.


Curatorial roles can be broadly divided into three domains:

1) Custodial – perhaps most in line with traditional curating in which the curator primarily maintains or preserves art collections
2) Navigatory – a role which has traditionally focused on archiving and cataloging pieces of art so that audiences can readily access art
3) Discerning – the responsibility of a curator to decide which artists and themes to include and feature, using the “eye of discernment” described by Nick Waterlow

Creativity and Curating

The diverse roles of curators are characterized by an inherent tension. Curators are charged with conserving and maintaining art (and by extension, culture) in their custodial roles, but they also seek out new forms of art and experiment with novel ways to exhibit art in their discerning roles. Terry Smith’s “Thinking Contemporary Curating” shows how the boundaries between curator and artist are becoming blurry, because exhibiting art itself requires an artistic and creative effort. Others feel that curators or exhibition makers need to be conscious of their primary role as facilitators and that they should not “compete” with the artists whose works they are exhibiting. This raises the question of whether the process of curating art is actually creative.
It is difficult to find a universal and generally accepted definition of what constitutes creativity because it is such a subjective concept, but the definition provided by Jonathan Plucker and colleagues in their paper “Why Isn’t Creativity More Important to Educational Psychologists? Potentials, Pitfalls, and Future Directions in Creativity Research” is an excellent starting point:
“Creativity is the interaction among aptitude, process, and environment by which an individual or group produces a perceptible product that is both novel and useful as defined within a social context.”
Using this definition, assembling an art exhibition is indeed creative – it generates a “perceptible product” which is both novel and useful to the audiences that attend the exhibition as well as to the artists who are being provided new opportunities to showcase their work. The aptitude, process and environment that go into the assembly and design of an art exhibition differ among all curators, so that each art exhibition reflects the creative signature of a unique curator.

Ubiquity of Curators

The formal title “curator” is commonly used for art curators or museum curators, but curatorial activity – in its custodial, navigatory and discerning roles – is not limited to these professions. Librarians, for example, have routinely acted as curators of books. Their traditional focus has been directed towards their custodial and navigatory roles, cataloging and preserving books, and helping readers navigate through the vast jungle of published books.

Unlike art curators, who play a key role in organizing art exhibitions, librarians are not the primary organizers of author readings, book fairs or other literary events; these are instead primarily organized by literary magazines, literary agents, publishers or independent bookstores. It remains to be seen whether the literary world will also witness the emergence of librarians as curators of such literary events, similar to what has occurred in the art world. Our local public library occasionally organizes a “Big Read” event for which librarians select a specific book and recommend that the whole community read it. The librarians then lead book discussions with members of the community and also offer additional reading materials that relate to the selected book. Such events do not have the magnitude of an art exhibition, but they are innovative means by which librarians interact with the community and inspire readers.

One of the most significant curatorial contributions in German literary history was the collection of fairy-tales and folk-tales by the Brothers Grimm (Brüder Grimm or Gebrüder Grimm), Jacob and Wilhelm Grimm. Readers may not always realize how much intellectual effort went into assembling the fairy-tales, many of which co-existed in various permutations depending on the region where the respective tales were narrated. I own a copy of the German language edition of the “Children's and Household Tales” (Kinder- und Hausmärchen) which contains all their original annotations. These annotations allow the reader to peek behind the scenes and see the breadth of their curatorial efforts, especially their “eye of discernment”. For example, the version of Snow-White that the Brothers Grimm chose for their final edition contains the infamous scene in which the evil Queen asks her mirror, “Mirror, Mirror on the wall, Who is the prettiest in all the land?” She naturally expects the mirror to say that the Queen is the prettiest, because she has just finished feasting on what she presumed were Snow-White’s liver and lungs and is convinced that Snow-White is dead. According to the notes of the Brothers Grimm, there was a different version of the Snow-White tale in which the Queen does not ask a mirror, but instead asks Snow-White’s talking pet dog, which is cowering under a bench after Snow-White’s disappearance and happens to be called “Spiegel” (German for “Mirror”)! I am eternally grateful for the curatorial efforts of the Brothers Grimm, because I love the symbolism of the Queen speaking to a mirror and because I do not have to agonize over understanding why Snow-White named her pet dog “Mirror” or expect a Disneyesque movie with the title “Woof Woof” instead of “Mirror Mirror”.

Internet Curators

The internet is now providing us access to an unprecedented and overwhelming amount of information. Every year, millions of articles, blog posts, images and videos are being published online. Older texts, images and videos that were previously published in more traditional formats are also being made available for online consumption. The book “The Information: A History, a Theory, a Flood” by James Gleick is quite correct in using expressions such as “information glut” or “deluge” to describe how we are drowning in information. Gleick also aptly uses the allegory of the “Library of Babel”, a brilliant short story written by Jorge Luis Borges about an imaginary library consisting of hexagonal rooms that is finite in size but contains an unfathomably large number of books, all possible permutations of sequences of letters. Most of these books are pure gibberish, because they are random sequences of letters, but amidst billions of such books, one is bound to find at least a handful with some coherent phrases. Borges' story also mentions a mythical “Book-Man”, a god-like librarian who has seen the ultimate cipher to the library, a book which is the compendium of all other books. Borges originally wrote the story in 1941, long before the internet era, but the phrase "For every rational line or forthright statement there are leagues of senseless cacophony, verbal nonsense, and incoherency" rings even more true today when we think of the information available on the web.
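To get a sense of the scale Borges had in mind: in the story, each book has 410 pages of 40 lines, each line of roughly 80 characters, drawn from an alphabet of 25 orthographic symbols. A quick back-of-the-envelope calculation (a sketch using only the figures from the story) shows why the library, although finite, is unfathomably large:

```python
from math import log10

# Figures from Borges' "Library of Babel": 410 pages x 40 lines x ~80 characters,
# drawn from an alphabet of 25 orthographic symbols.
SYMBOLS = 25
CHARS_PER_BOOK = 410 * 40 * 80          # 1,312,000 characters per book

# Every possible permutation of characters corresponds to exactly one book,
# so the library holds 25**1,312,000 distinct books -- a finite number,
# but one with roughly 1.8 million decimal digits.
digits = int(CHARS_PER_BOOK * log10(SYMBOLS)) + 1

print(f"Characters per book: {CHARS_PER_BOOK:,}")
print(f"Distinct books: 25**{CHARS_PER_BOOK:,} (about 10**{digits - 1:,})")
```

Almost all of these books are, of course, the “senseless cacophony” of the epigraph, which is exactly why a librarian, or a curator, is needed at all.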

This overwhelming and disorienting torrent of digital information has given rise to a new group of curators, internet or web curators, who primarily focus on the navigatory and discerning roles of curatorship. Curatorial websites or blogs such as 3quarksdaily, Brainpickings or Longreads comb through mountains of online information and try to select a handful of links to articles, essays, poems, short stories, videos, images or books which they deem to be the most interesting, provocative or inspiring for their readers. They disseminate these links to their readers and followers by posting excerpts or quotes on their respective websites or by using social media networks such as Twitter. The custodial role of preserving online information is not really their focus. In addition to the emergence of professional internet curatorship through such websites or blogs, a number of individuals have also begun to function as volunteer internet curators and help manage digital information.

Analogous to art curatorship, internet curatorship also requires a significant creative effort. Each internet curator uses individual criteria to choose the themes they focus on and to assemble their own collage of information. Even when internet curators have thematic overlaps, they may still decide to feature or disseminate very different types of information, because the individuals engaged in curatorship have very distinct tastes and subjective curatorial criteria. One curator’s chaff is another curator’s wheat.

Formal Education and Training in Internet Curation

There are currently no formal training programs that prepare people to become internet curators. Most popular internet curators have a broad range of interests, ranging from the humanities, arts and sciences to literature and politics. They use their own experience and expertise in these areas to help them select the best links, which they then pass on to their readers or followers. Some internet curators are open to suggestions from their readers, thus crowd-sourcing their curatorial activity; others routinely browse selected websites or the social media feeds of individuals whom they deem to be the most interesting; still others plug in their favorite keywords to scour the web for intriguing new articles.

Internet curation will become even more important in the coming decades, as the amount of information we amass will likely continue to grow exponentially. Not just individuals, but also corporations and governments will need internet curators who can sift through information and distill it down to manageable levels without losing critical content. In light of this anticipated need for internet curators, one should ask whether it is time to envision formal training programs that help prepare people for future jobs as internet curators. Internet curation is both an art and a science – the art of the curatorial process is to creatively assemble information in a manner that attracts and inspires readers, while the science of internet curation involves using search algorithms that do not just rely on subjective and arbitrary criteria but systematically interrogate the vast amounts of information that are now globally available. A Bachelor’s or Master’s degree program in Internet Curation could conceivably train students in the art and science of internet curation.

Q-Credit

In scientific manuscripts, it is common for scientists to cite the preceding work of colleagues. Other colleagues who provide valuable tools, such as plasmids for molecular biology experiments, are cited in the “Acknowledgements” section of a manuscript. Colleagues whose input substantially contributed to the manuscript or the scientific work are included as co-authors. Current academic etiquette does not necessarily acknowledge the curatorial efforts of scientists who may have nudged their colleagues into a certain research direction by forwarding an important paper that they might have otherwise ignored.

Especially in a world in which meaningful information is becoming one of our most valuable commodities, it might be time to start acknowledging the flux of information that shapes our thinking and our creativity. We are beginning to recognize the importance of people who are links in the information chain and who help separate meaningful information from the “senseless cacophony”. Perhaps we should therefore acknowledge all the sources of information, not only those who generated it but also those who manage the information or guide us towards it. Such a curatorial credit or Q-credit could be added to the end of an article. It would not only acknowledge the intellectual efforts of the information curators, but could also serve as a curation map that would inspire readers to look at the individual elements in the information chain. Readers would be able to consult the nodes or elements that were part of the information chain (instead of just relying on lone cited references) and choose to take alternate curation paths.

I will try to illustrate a Q-credit using the example of Abbas Raza, who pointed me towards a 3quarksdaily discussion of “Occidentalism” and an essay by the philosopher Akeel Bilgrami. Even though I had previously read Edward Said’s book “Orientalism”, the profound insights in Bilgrami’s essay made me re-read it. The Q-credit could be acknowledged as follows:

Q-Credit: Abbas Raza --> The 2008 3Quarksdaily Forum on Occidentalism --> “Occidentalism, the Very Idea: An Essay on Enlightenment and Enchantment” by Akeel Bilgrami, published in 2008 on 3Quarksdaily.com and in 2006 in Critical Inquiry --> Bilgrami identifies five broad themes in Edward Said’s Orientalism
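To make the idea a bit more concrete, here is a minimal sketch of how such a curation map could be represented and rendered programmatically. This is purely illustrative: the data structure and function names are my own assumptions rather than part of any existing citation standard, and the node labels simply restate, in shortened form, the example chain above.

```python
from dataclasses import dataclass

@dataclass
class QNode:
    """One element of the information chain: a person, forum, essay or book."""
    label: str

def render_q_credit(chain):
    """Render a curation chain as an arrow-separated Q-credit line."""
    return "Q-Credit: " + " --> ".join(node.label for node in chain)

# The example chain from above, with labels shortened for readability
chain = [
    QNode("Abbas Raza"),
    QNode("2008 3Quarksdaily Forum on Occidentalism"),
    QNode("'Occidentalism, the Very Idea' by Akeel Bilgrami"),
    QNode("Five broad themes in Edward Said's Orientalism"),
]

print(render_q_credit(chain))
```

Because each node is an explicit element rather than free text, readers or tools could in principle follow the chain back to any link, or branch off onto an alternate curation path, which is precisely what a curation map is meant to enable.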

The acknowledgement of information flux is already part of Twitter netiquette. The German theologian Barbara Mack uses her Twitter handle @faraway67 to curate important new articles about history, science, music, photography, linguistics and literature. She sees the role of web curators as similar to that of music conductors, who do not compose original pieces of music but instead give an audience access to the original creative work. She says that “web curation is a relatively new field of dealing with information and good curation is an act of creativity which requires dedication and a keen sense for content.” She agrees that curators should indeed be given credit, “not only out of courtesy but to acknowledge their efforts of taking upon the challenge of bringing the vast information the web provides into a handy form for their followers to enjoy.”

Twitter curators such as Barbara Mack use abbreviations such as h/t (hat-tip) or RT (retweet) followed by a Twitter handle to acknowledge their sources. Contemporary Twitter netiquette suggests that followers who find curated links useful should acknowledge the curators' efforts when tweeting the links on.

One challenge that is intrinsic to Twitter (but may apply in an analogous fashion to other social media networks as well) is that each tweet can only contain 140 characters, which presently makes it very difficult to acknowledge the full curatorial information flux. If I decide to tweet about an interesting article on the philosophy of science which I found in the Twitter feed of person X, the space limitations may make it impossible for me to give credit to all the preceding members of the information chain who had directed X’s attention to that specific article. The Q-credit system may thus be best suited for acknowledgements at the end of blog posts or articles, but not for social media messaging with strict space limitations.

The Future of Internet Curation

The area of internet curation is still in its infancy and it is very difficult to predict how it will evolve. Managing online information will become increasingly important. Even though such managerial roles may not necessarily carry the title “internet curator”, there is little doubt that managing online information in a meaningful manner is one of the biggest challenges that we will face in the 21st century. I am quite optimistic that we will be able to address this challenge, but the first hurdle is to recognize it.

Image Credit: The Librarian by Giuseppe Arcimboldo (1527–1593)

Note: An earlier version of this article was first published on 3quarksdaily.com

Selected Q-Credits:

            1. “The Cambridge Handbook of Creativity” (2010) by James C. Kaufman and Robert J. Sternberg --> Chapter 3 “Assessment of Creativity” by Jonathan A. Plucker and Matthew C. Makel --> “Why Isn’t Creativity More Important to Educational Psychologists? Potentials, Pitfalls, and Future Directions in Creativity Research” (2004) by Jonathan A. Plucker et al. in Educational Psychologist, 39(2), 83–96
            2. “Thinking Contemporary Curating” (2012) by Terry Smith --> Information about Nick Waterlow and still image from “A Curator’s Last Will and Testament”
            3. Book review of “The Information” at Brainpickings --> “The Information: A History, a Theory, a Flood” (2011) by James Gleick --> “Library of Babel” by Jorge Luis Borges as an allegory for the information glut

Saturday, January 19, 2013

The Writer's Secret Is Not Inspiration


An excerpt from Orhan Pamuk's 2006 Nobel lecture:


"The writer's secret is not inspiration – for it is never clear where it comes from – it is his stubbornness, his patience. That lovely Turkish saying – to dig a well with a needle – seems to me to have been said with writers in mind. In the old stories, I love the patience of Ferhat, who digs through mountains for his love – and I understand it, too. In my novel, My Name is Red, when I wrote about the old Persian miniaturists who had drawn the same horse with the same passion for so many years, memorising each stroke, that they could recreate that beautiful horse even with their eyes closed, I knew I was talking about the writing profession, and my own life.

If a writer is to tell his own story – tell it slowly, and as if it were a story about other people – if he is to feel the power of the story rise up inside him, if he is to sit down at a table and patiently give himself over to this art – this craft – he must first have been given some hope. The angel of inspiration (who pays regular visits to some and rarely calls on others) favours the hopeful and the confident, and it is when a writer feels most lonely, when he feels most doubtful about his efforts, his dreams, and the value of his writing – when he thinks his story is only his story – it is at such moments that the angel chooses to reveal to him stories, images and dreams that will draw out the world he wishes to build.

If I think back on the books to which I have devoted my entire life, I am most surprised by those moments when I have felt as if the sentences, dreams, and pages that have made me so ecstatically happy have not come from my own imagination – that another power has found them and generously presented them to me."

The complete Nobel lecture can be found here.

Image Credit: Orhan Pamuk by David Shankbone 2009, Via Wikimedia - Creative Commons license

Friday, September 28, 2012

Do Not Write Love Poems

Let us have a look at some of the best advice on writing that has ever been given to an aspiring poet. Rainer Maria Rilke (1875-1926) was one of the greatest poets of the German language. He wrote a series of ten letters to Franz Xaver Kappus, an aspiring poet, that were later published as "Briefe an einen jungen Dichter", or in the English translation as "Letters to a Young Poet".

In these letters, Rilke initiates Kappus into the mysteries of writing poetry as well as into the art of living a meaningful life. The advice given by Rilke is just as valuable today as it was more than a century ago. Perhaps, in a networked world in which solitude has become a rare luxury and treat, Rilke's advice has become even more valuable.

I will present some excerpts of the letters in the original German as well as a translation into English.

The letter dated February 17, 1903 contains the following passage:


Schreiben Sie nicht Liebesgedichte; weichen Sie zuerst denjenigen Formen aus, die zu geläufig und gewöhnlich sind: sie sind die schwersten, denn es gehört eine große, ausgereifte Kraft dazu, Eigenes zu geben, wo sich gute und zum Teil glänzende Überlieferungen in Menge einstellen.  
Darum retten Sie sich vor den allgemeinen Motiven zu denen, die Ihnen Ihr eigener Alltag bietet; schildern Sie Ihre Traurigkeiten und Wünsche, die vorübergehenden Gedanken und den Glauben an irgendeine Schönheit - schildern Sie das alles mit inniger, stiller, demütiger Aufrichtigkeit und gebrauchen Sie, um sich auszudrücken, die Dinge Ihrer Umgebung, die Bilder Ihrer Träume und die Gegenstände ihrer Erinnerung.

My translation of this passage is:

Do not write love poems; try to initially avoid those forms that are too commonplace and ordinary: they are the most challenging, because it takes great strength and maturity to create something of your own, when you have to compete with so many good and even great predecessors. 
So save yourself from these general themes and instead write about what your everyday life offers you; describe your sorrows and desires, your fleeting thoughts and your belief in some form of beauty - describe all this with heartfelt, quiet and humble sincerity. When you express yourself, use the everyday items around you, the images from your dreams, and the objects from your memories.


Why is Rilke's advice so important? And why does it apply to all writers, not just to poets?

Many of us who try to write run into the highly prevalent and painful condition known as "Writer's block". Here is what Professor Wikipedia says about "Writer's block":


Writer's block is a condition, primarily associated with writing as a profession, in which an author loses the ability to produce new work. The condition varies widely in intensity. It can be trivial, a temporary difficulty in dealing with the task at hand. At the other extreme, some "blocked" writers have been unable to work for years on end, and some have even abandoned their careers.


There is no straightforward cure for this agonizing ailment, which can paralyze the mind and soul alike. In an aspiring writer, it awakens the desire to eat junk food, watch mind-numbing sitcoms and snap at all fellow primates that try to communicate with you. Most of my Writer's block flare-ups occur when I try to write about lofty and grand themes, such as Love, Death or Justice. Whenever I sit down in front of my keyboard to start writing about such a profound theme, I think about all the wonderful poems, essays and novels that have been written about these topics in the past. My fingers are paralyzed, because I feel there is nothing I can add to what Goethe, Rilke, Eichendorff and Star Wars have already eloquently put in words.

But Rilke tells us that this is the wrong approach. We should focus on the tedious details of our lives. Let us face it, most of our lives are quite boring and ordinary, but these mundane and tedious details are what really define us and distinguish us from other writers. When I have to decide whether my sixth "How-to-be-a-Writer" self-help book should be "Write Is a Verb: Sit Down, Start Writing, No Excuses" or "Vex, Hex, Smash, Smooch: Let Verbs Power Your Writing", or when I experience Schadenfreude because I made it onto the commuter train on time while the person who pushed me aside earlier is left standing angrily on the platform, I am dealing with the mundane details of my own life. These moments are much easier to describe, because they are truly my own experiences and when I write about them I am not burdened by the history of great writing. I do not know what self-help writing guides Shakespeare used, but I am pretty sure he did not read "Write Is a Verb: Sit Down, Start Writing, No Excuses" and I am also pretty sure he did not experience the Schadenfreude of getting on the train.

Once I start writing about these seemingly mundane topics, my writing is more sincere. I also realize that extraordinary ideas are derived from ordinary details. 

The complete German text of the letter can be found on www.rilke.de.

An English translation of the complete letter is available here.