Saturday, December 29, 2018

Orthorexia Nervosa: When Healthy Food becomes an Obsession


Illustration of Alice with the rabbit and the Mad Hatter at a table
Being healthy is good, but being constantly obsessed with one’s health is neither good nor healthy. It is true that we often neglect health matters, especially in the modern day and age, where greater wealth brings more immediate access to junk food, dangerously paired with a sedentary lifestyle. It is a sad and tragic irony that a billion people are overweight or obese, while another billion suffer from serious malnutrition in other parts of the world. The former attempt to shed their excess pounds through diet, exercise and pills, while the latter are simply trying to survive and make it to the next day.

Nonetheless, this health crisis ought not to be blamed solely on quick and easy access to fast food and the fast-paced lifestyle that embraces microwavable food. We are also driven and prompted by advertising and our surroundings to consume more and to consume more often, and that includes everything from electronic devices to unhealthy food options. The same way we instinctively grab for our smartphones, we munch and snack on chips, cookies and chocolate, the three deliciously dangerous and addictive Cs in our lives.

It is not a complete surprise, then, that we have wide-ranging and serious problems with obesity, and that, shockingly, these now begin and develop at a much younger age than ever before. Nowadays, even children can develop Type 2 Diabetes, and as a rule they suffer from more and more health problems that used to apply to adults only. It is also true that, psychological factors aside, one of the main culprits here is our careless and constant food intake.

We are bombarded not only with information that is harmful to our psychological and physical well-being but also with information - in some cases too much of it - about being and becoming healthy. In today’s world, where technology enables us to access an unprecedented wealth of information at our fingertips, this can come as a blessing but equally as a potential downside for some.

Indeed, we are often like pendulums that know no moderation and may swing from one extreme to the other. There are times when we completely neglect our health only to replace that neglect later with an ardent obsession for it. While the former - our neglect and avoidance of healthy behavior - is often mentioned, discussed and criticized in the media as well as among friends and colleagues, the latter - our obsessive drive for health - is not given much food for thought, nor is there much discussion about its potential toxic effects on our health and well-being. But there ought to be.

In fact, not caring about one’s health is almost as bad as caring too much about it. In most cases, the root of the problem on both sides of the pendulum is psychological, namely neuroticism stemming from trauma, past experiences, drives and lifestyle. These neurotic tendencies may manifest themselves in different and rather extreme forms. We only need to look at the media to notice how obsessed people have become with diets and dieting and how nervously they jump and hop on and off one trend or bandwagon after another. Although some adjustment to one’s diet is not a bad thing, the pendulum may swing from the Atkins diet to paleo and gluten-free fads, and we are often left in rather the same state and weight (if not worse) as when we initially began.

I am not saying that taking care of one’s health, adjusting one’s lifestyle or watching what one eats and drinks is not important or essential for our well-being, but I am concerned with those for whom this is merely a substitute and a rather convenient foil and excuse for obsessive and compulsive behavior. While carelessness with health is generally frowned upon and obesity is often attacked up to the nasty and cruel point of shaming people, we let health-obsessed people off the hook. Well, until now at least.

They may be able to fly under the radar, but their fierce and relentless pursuit of health makes them perfect candidates for unhealthy obsessions. Like gamblers, they too are facing an addiction; however, society encourages them, turns a blind eye toward their excess and may even use them or hold them up as role models for others. But, in fact, they are suffering from a condition called Orthorexia.

Orthorexia nervosa is an obsession with one’s food intake. Essentially, at least psychologically speaking, it is not that different from anorexia or bulimia. People with Orthorexia are obsessed with their food intake and neurotically count calories while being constantly on the lookout for healthy food; they shun all types of unhealthy food as if they were the devil incarnate. Such a constant and ubiquitous manner of thinking about food is exhausting, and instead of providing bouts of energy, it can drain them both physically, due to a lack of essential nutrients, and psychologically, due to raging internal conflicts.

What are the symptoms of Orthorexia Nervosa? Like anything, even health needs to be taken with a grain of salt, and while we should be concerned about our well-being, we should not be obsessed by it or think about it at every turn and moment of our waking life. Whenever you see somebody talking non-stop about health or, worse, trying to preach to you about the shortcomings of your behavior or your lack of restraint, beware. I have known reasonable vegetarians in my life (I myself was a proud member of the club in my younger years), but then there are also the preaching and self-righteous vegetarians; the latter sound like possessed evangelical preachers who would like to send you straight to hell for your lack of concern regarding your health and that of animals.

But even vegetarian and vegan diets have their pros and cons. In fact, there are studies showing how strict or poorly calibrated vegetarian diets can wreak more havoc than they provide benefits. Imbalanced diets can lead to diseases and chronic conditions ranging from osteoporosis, cataracts and allergies to a number of mental health ailments and even certain forms of cancer.

Blanket statements that vegetarian or vegan diets are always good and healthy, or that eating meat is completely unhealthy, miss the point and lead to false perceptions as well as faulty reasoning. In fact, strict or severely restrictive diets rob the body of important nutrients, strain it, and can lead to poor physical and emotional health. Too much of a good thing can then suddenly turn into its unintended opposite.

I find the religious metaphor not unwarranted here. The drive for pure, right or perfect food is not unlike the puritan belief of becoming clean and pure by shunning any kind of evil influence or so-called devil’s work. People are constantly alert about possible temptations and the devil may then reveal himself as a chocolate cupcake or a slice of hot cheese-dripping pizza. There is also the unspoken and understood undercurrent that people who practice such “cleansing” behaviors are deemed or see themselves as morally superior, a kind of holier-than-thou attitude that often permeates or emanates from them and supposedly gives them the right and permission to preach to others, the perceived sick and infidels.

The negative effects of Orthorexia are not limited to a severe restriction of necessary food intake and nutrients; people suffering from this condition may also isolate themselves because of their beliefs. They may avoid social events and gatherings where those around them would partake of unhealthy food; conversely, they may even be rejected by their peers for upholding and acting on such drastic and radical views.

Although Orthorexia is not (as of yet) an officially recognized eating disorder, there are ways of gauging its impact and influence through a simple test. Dr. Steven Bratman, who coined the term, has provided a set of questions to test yourself for this eating disorder. For example, if you are constantly reading nutritional labels, thinking about food throughout the day, meticulously planning the next day’s meals in advance, or have given up and sacrificed the pleasure of eating and replaced it with the cold counting of calories, then you may have, or be in the process of developing, this disorder.

Food should not only provide nutrients to your body; it should be much more. As a food lover, I very much appreciate its myriad nuances in our life and history. It is a cultural statement, evidenced in the variety of ethnic food from different regions; it is a social event that bonds and unites families and communities; and it is first and foremost one of the most pleasurable activities out there. We go out not only on dates but also eat out to celebrate important events and accomplishments. Not enjoying food, shunning or severely restricting it, or feeling consistently and constantly guilty about it are warning signs that you may be heading in the wrong direction in your quest for health.

Monday, November 12, 2018

The Death and Legacy of the Immortal Artist

Portrait of German philosopher with glasses and moustache


Friedrich Nietzsche, undoubtedly a highly influential thinker and one of the most important philosophers of the modern age, was and continues to be often misunderstood and maligned, and more often than not mired in controversy. The controversy may occur at times due to his innovative and revolutionary ideas, but at other times due to influences and circumstances that were completely outside of his control.

One particularly unfortunate historical circumstance goes back to his estranged sister, Elisabeth Förster-Nietzsche, who took charge of and was mainly responsible for the great philosopher’s legacy. As she was a fervent Nazi supporter and outspoken antisemite, many of her brother’s ideas were distorted to please the growing and usurping power of the ultraright. The latter pounced upon that opportunity to appropriate and misuse the great philosopher’s ideas, molding them to align with their abhorrent worldview.

This association with the Nazis not only distorted his views but also turned Nietzsche into someone he certainly was not. Nietzsche, in fact, would have been the first to stand up against the tyranny that the Third Reich came to represent, and its interpretation of the Übermensch could not have been further from the actual ideas of the great philosopher. As a result, this association has not only overshadowed the brilliance of his great mind but has effectively caused hesitation about, if not downright rejection of, the ideas Nietzsche represented.

The situation is somewhat different when it comes to the imposing German composer Richard Wagner. That his music is majestic and brilliant is beyond doubt, but its appropriation by the Nazis created a long-lasting stain and wound, which is reflected in Israel’s current stance of refusing to publicly perform his works, albeit without downright banning them. When it comes to Wagner, there is palpable antisemitism expressed in his writings, and hence, unlike in the case of Nietzsche, the association with the Nazis may appear justified in certain respects. Put differently, much less distortion was needed to align the composer’s ideas with the Nazi mentality.

My point here is not in any way to defend the racist ideas of the musician but simply to state an observation about artists and their creations: sometimes we need to separate the man (or woman) from the work. I think that choosing to censor and neither play nor listen to the beautiful music of Wagner is an error. I cannot deny feeling guilty about enjoying and appreciating his music, but let us keep in mind that there are many Jewish conductors, such as Daniel Barenboim, who end up publicly performing Richard Wagner’s music. Although I would prefer artists to be a model and example unto others in their personal lives as well, I can accept and tolerate possible gaps and discrepancies between the created work and the person who created it.

For all intents and purposes, we might see Charles Bukowski as a lowlife drunkard, but his poetry is exquisite and beyond reproach. In the realm of film and cinema, Alfred Hitchcock, alongside many other great directors, may have been difficult and tyrannical on the movie set, but he created enduring works of art. We should refrain from immediately jumping to conclusions and judging the work by its creator. Rather, we should attempt to isolate the work whenever deemed necessary and regard it as a separate entity unto itself.

To give an example, what if Bach – and this is merely speculation and most likely completely untrue – had not been religious at all? Would that make his music less moving from a spiritual and religious perspective? We might feel disappointed; we might think it hypocritical, but the fact remains that his work is imbued with divinity, or at least a divine feeling and inspiration. Equally, if my own musical god, the incomparable Ludwig van Beethoven, were to be stained with scandal, I would still put him on a pedestal - not because of his personality but because of his enduring and powerful music.

This type of judging both the artist and the work has found new precedents in the recent Me Too movement, where some great artists have fallen from their pedestals as a result. Yet here I must offer a relevant distinction and caveat. In the case of actors, the situation is slightly different, as they live off and thrive on their persona. We know that actors are playing roles, but since their work is visually represented, it becomes much more difficult to disassociate one from the other. When I listen to Wagner, I do not think of him personally; rather, I imagine scenes inspired by his music.

However, this distancing effect cannot occur, or is much more difficult to achieve, when it comes to acting. Acting is a visual form of performance and is closely tied to the person, despite the current ability to change physical characteristics and age onscreen through make-up and computer technology. The fact remains that I find it harder to disassociate Kevin Spacey the actor from his actual despicable deeds. Evidently, it becomes much more intense and complicated when the main character of a show is supposed to embody a strong father figure and role model, as is the case with Bill Cosby and his Cosby Show. In that case, I cannot re-watch any of those episodes without feeling queasy deep inside my stomach.

Yet this disassociation is much easier with directors, who generally stand behind the camera and are less visible to us. Although I strongly condemn the actions perpetrated by Woody Allen and Roman Polanski, I can still appreciate and value their works of art despite those implications. Their works are indeed personal expressions and extensions of themselves, but the films transcend them in the process. Their work can then be appreciated on its own merit without the presence of its creator. In fact, works of art often manage to become independent of the author and gain a life of their own, and this I can respect in the greatest artists despite their flaws.

Another, rather offbeat example of visual representation is the stubby toothbrush moustache. In that case, it is not Charles Chaplin’s face that comes to mind, but Hitler’s. As a result, it is anathema to wear one without risking dangerous association with the Nazis. The same can be said of the swastika, which was initially an Indian symbol of purity and healing until the Nazis soiled it forever with their dirty hands.

But let us keep in mind that Hitler himself was a vegetarian. This is a very uncomfortable fact, but few people associate vegetarianism with him, and vegetarians still see their alimentary choice as a noble one. With the exception of the very few vegetarians who opt for draconian measures of forcing their views upon everyone else, being a vegetarian has little connection with Nazi ideology. The main reason for this is that it is not visually connected with it and thus not fresh or salient in our minds.

To go back to the first cases of Nietzsche and Wagner, we can see how it was much easier to disassociate the philosopher from Nazi ideology than it was with the great composer. Music in its representation is more forceful because it is auditory. The Nazis actively listened to and used Wagner’s works in their operatic spectacles, which willy-nilly created a closer tie between the two. The fact that Hitler proclaimed Wagner to be his favorite composer does little to help in this matter. Ideas, on the other hand, are more abstract and have been somewhat easier to dissociate from their grip.

Finally, as a closing thought, we should not forget that people not only live in but are also products of their epoch, and that almost nothing appears or is created out of a vacuum. Put differently, the times must, to an extent, have promoted or encouraged certain trends and movements. For instance, during the fascist reign, science was actively interested in and investigated the use and effects of eugenics. This was also the time when Aldous Huxley presented us with his brave new world divided and controlled by genetics.

In a similar vein, views of women, and of sexuality in particular, strongly influenced how people thought and acted in their specific times. Although I believe that great minds can surpass their own times - look at Nietzsche, for instance - we should avoid judging people retroactively and with hindsight. Yes, it is troubling and leaves a sour taste that someone as enlightened as Thomas Jefferson would still keep his own slaves, but that should not diminish the power and force of his message.

Sunday, September 16, 2018

The Fall from Grace: Garden of Eden Revisited


Adam and Eve depicted in the Garden of Eden surrounded by many animals
I have always found the biblical story of the Garden of Eden to be puzzling and confounding, to say the least. Questions have abounded regarding its moral lesson and utility. The story seems to suggest that original sin came into existence as the direct result of disobeying an (admittedly blind and overbearing) power and authority; worse, since the act involved the dichotomy between ignorance and knowledge, the Bible seems to suggest that the former is preferable to the latter, hence delivering a primordial message that ignorance is bliss. Did God really want us to live and be stuck in the shadowy realm of ignorance?

Add to that the copious amount of misogyny thrown in as well as thrown at Eve, the mother of all living, who is blamed for the ultimate form of temptation, i.e. knowledge and understanding, and one can only scratch, or better shake, one’s head in profound disbelief, if not utter astonishment, at this biblical tale.

That is why, until quite recently, the Gnostic reading and interpretation of Genesis seemed more reasonable and much more to my liking. It was the serpent that spoke with the voice of reason, whereas God’s (over)reaction spoke volumes about his fear of humans one day equaling (or even surpassing) him. This may be the main reason why he not only banishes Adam and Eve from his realm but even places a cherub with a flaming sword to guard the tree of life, lest humans become immortal too.

Yet when I stumbled upon Erich Fromm’s interpretation, it shone much needed light upon the hitherto dubious beginning of humanity. This all goes back to a concept of God that is overlooked and misunderstood in the Christian view. 

God is conceived as perfect and static. With this goes the mainly cerebral definition (might I say limitation) that everything that is perfect has already reached its full potential and cannot, ipso facto, improve in any discernible way whatsoever.

In that sense, the most perfect state would be one that is utterly and completely dead, namely death seen from a strictly materialistic and nonspiritual angle designated and determined as the endpoint and cessation of any forms of consciousness. A stone would then be the most perfect of all beings having reached the stage of being perfectly static and immovable.

But if anything, the Bible shows us that God’s heart alongside his will are not made of stone. He is volatile and fluctuating; he is angry and forgiving; he is loving and cruel; he is at times merciless and at other times full of mercy. And if his very own statement and discovery to Moses were translated as “I am that I am” and taken at face value, it would entail that we are confronted with and praying to a rather intentionally and purposefully conflicting, contentious and confusing power and being.

Yet if we consider God not as standing statically and immovably outside of time but rather as a point on the plane of evolution, then God might lose his eternally fixed constant of always being, or rather always remaining, who he is; but he shall then become who he shall be, a becoming that could continue eternally.

This might be a possible and closer translation of the actual meaning of his comment. If read as “I am who I shall be,” there is room for the possibility of change and improvement and a certain drive for perfection within divinity itself. Read in such a way, we see evolution and development not only within humans but reflected within God himself, as we were made in his image, the same way he is made in ours.

If people object to praying to a god that is not already perfectly formed but like his creation strives for perfection on a higher plateau (a view not incompatible with the Buddhist concept of the universe), then one might ask oneself why it would be preferable to worship him as a seemingly emotionally unstable entity. Indeed there are countless moments of anger and fury, where he is controlling and impeding his creations; yet over time he begins to form a loving bond and relationship with humans and shows his greatest sign of love by offering and sacrificing the Son of Man or by making himself Flesh in order to sanctify all human beings and provide them with the necessary divine spark, not unlike the fire of Prometheus in Greek legend.

Such a reading of the Bible would explain why God is initially suspicious of his creatures, as there is a fundamental lack of trust and love and there is not a relationship per se between him and Adam and Eve; yet God manages to change and adjust his point of view.

In turn, Adam and Eve did not have much rapport with either God or themselves. Erich Fromm points out that Adam and Eve did not know who they were and that they lived in a complete state of natural, primordial harmony. Our earliest ancestors must have lived similarly, as they were, and saw themselves as, an inseparable part of nature. Yet it all had to come to an end so that growth and evolution could manifest themselves.

This is symbolized by Adam and Eve eating from the tree of knowledge of good and evil. Suddenly, they lost all touch and contact with nature and were left on their own. They became aware not only of their separate identities but, more importantly, of their loneliness. Suddenly they stopped seeing themselves as one with nature, and they saw each other as perfect strangers.

Unadulterated paradise existed no more and each had to survive on their own. There was no love yet between them, but only shame, embarrassment and guilt. Adam does not act out of love but out of spite when he blames Eve for the transgression. He tries to save and salvage himself. He has completely forgotten that Eve is part of him and his equal since both are part and parcel of nature and the harmony around them.

The fact that God created Eve out of Adam’s rib has given rise to erratic speculation and to the faulty and irresponsible conclusion that he must therefore be superior and she inferior to him. This kind of conclusion is misguided and harmful and is too focused on the literal meaning of an unnecessary detail. What is more important here is the fact that she is created and taken out of him, his body and mind, and that Adam without Eve is incomplete, not unlike Plato’s androgynous being as described in the myth of Aristophanes.

The rib, or rather the rib cage, is meant to protect and support two of our most vital organs: the heart, an evident symbol of and stand-in for love, and the lungs, which regulate and control respiration, the very breath of life. Furthermore, bones represent enduring life, since they persist longer than the flesh. If we conceive of Adam as flesh, then Eve is the bone, the physical and spiritual support of the body. In such a manner, man and woman complement and complete each other.

The same idea is, in fact, expressed by Adam himself when he states that Eve is “bone of my bones and flesh of my flesh.” It is thus that they shall be one flesh, and both continue to be like God, as he created them in his image and likeness. To separate one from the other, or to perceive one as essentially different from the other in terms of spirit, love or intellect, would then be a faulty and misguided interpretation.

In addition, there might also be a case of mistranslation, since the word rib may originally have meant “side.” That would mean that God did not take Adam’s rib but half of his side, so that woman would be be-side man, neither beneath nor above him; they would be side by side and perfectly equal.

Before the supposed act of rebellion, Adam and Eve are indeed in a world of pure sensations, or rather what Freud would term the primary process. Eden is a paradise in which all beings and animals are one, communicate with each other and live in perfect harmony. This is akin to the world of the infant, who has had his or her needs met in the womb and who comes into the world blind to the outside world, still feeling strongly connected and attached to his or her mother.

The moment that this idyllic situation experiences a rupture is the growing awareness of the outside world in terms of other people, objects, and food. This world is explored primarily through the mouth of the infant and through basic sensory experiences, including taste, smell and temperature and corresponding feelings and associations.

It comes as little surprise, then, that the outside world would be represented by a tree bearing fruit, the proverbial apple. It is through the physical ingestion of that so-called forbidden fruit that knowledge is gained. Suddenly the perfect harmony is in disarray, and Adam is disconnected from it and begins to feel separate and lonely, cut off from nature as well as from other beings.

In fact, what he feels for Eve is not, nor can it be, love, since the first thing he does is justify himself before God by accusing her of having incited him. This is also connected to the sudden realization not only of physical nakedness but of the feeling of shame that is strongly tied to it. It is the budding of sexual instincts, not in the form of a spiritual or romantic love union but merely as a primitive instinct or drive.

What stands out between them is their pronounced and visible difference. There is a selfishness or self-obsession that drives and separates one from the other, and it may be conceived of as the growing pains of a birth. Adam and Eve, God’s first children, are no longer led by rules, by admonishments in the form of don’ts; rather, they shall acquire moral insight into how to be and how to become, that arduous but eternally rewarding path toward morality and goodness.

In that moment, in the very act of rebellion, humanity has taken its first stand, or rather it is the first time that humanity stands on its feet; now it needs to learn to walk, and, more importantly, love each other and its Creator. 

Saturday, September 1, 2018

Narcissistic Mothers


Painting by Ambrogio Borgognone from Rijksmuseum
The mother, nourishing and taking care of her infant, is upheld not only as an ideal for human development, but also considered a sign of spirituality: a mother is often depicted as an embodiment of the sacred and the holy, which is most expressly symbolized by the Holy Virgin of the Catholic religion. In fact, motherhood is put on a pedestal as she is seen engaged in the selfless act of not only protecting her loved ones but also imbuing them with unconditional love. 

Motherhood is equally reflected in the symbol of the Earth, providing protection and nourishment for its inhabitants, and in the symbol of the fertile land or of one’s home in its purest state. In the Bible, this is referred to as the land where milk and honey flow freely. As Erich Fromm points out in his seminal book The Art of Loving, this is the symbol of the mother taking her flock under her wing. But it is important to underscore that ideally each mother would provide both milk AND honey to her offspring.

The giving of milk is triggered automatically and naturally within the body via breastfeeding, and despite modern, artificial and less adequate forms of nourishment (formula as a milk replacement), it is still what most mothers initially provide to their hungry infants.

Humans as a rule are genetically predisposed to instinctively feel warmth and love towards a child, but this feeling tends to be more pronounced in the mother. She will give milk and nourishment to her helpless infant and only the cruelest and most resentful of mothers would deny providing this to their dependent baby.

But as man (and woman) cannot live by bread alone, the child will need more than milk. This is where honey comes into play: it stands for the sweetness of life. It is a mother’s twofold responsibility to provide her child not only with the basic amenities but also with sweetness, meaning an abundance of joy for life as well as spiritual satisfaction and fulfillment. However, most mothers fall short on this second aspect, which causes a wide range of problems in the growth and development of the child and spills over and carries on into adulthood.

This so-called lack of sweetness is most pronounced in neurotic individuals, and it is a condition most promoted and exacerbated by narcissistic mothers. On the surface, these mothers may appear to be beacons of perfect motherhood; being narcissists, mothers of this kind enjoy both being the focus of attention and having the infant in the most helpless and dependent state of his or her life.

They gladly take on the role of the giving mother by providing milk, as they see the infant as a reflection of their own ego. For a while, narcissistic mothers become the center of attention among family members and friends, and they revel in that feeling as they lavishly soak up each and every aspect of the situation.

Humans, unlike other animals, are born into a rather prolonged state of helplessness and dependency, and they need their parents, especially their mother, for their survival. In fact, infants are born practically blind. There is no other world for them except that of the mother, with whom they feel united and unified.

Infants tend to feel that they are still in the womb connected to their mother through a now invisible umbilical cord and, in fact, their mother is not only their first contact with the world, but she is also the first love relationship in their life. 

Yet after some time, infants not only become aware of the outside world as separate from themselves, but they also notice a new, budding identity that feels separate and distant from the mother. This is a period in which moments of separation from the mother can create intense feelings of anxiety within the child. In their minds, they fear that the mother has abandoned them and left them to their own devices, which from an evolutionary point of view would signify certain death.

The narcissistic mother still enjoys that stage of development, but she becomes aware of, and preoccupied with, the fact that the power she has held and wielded over the child is slowly beginning to diminish.

As the child becomes more and more independent, the mother who provides honey will not only accept that growing separation and independence, but she will actively encourage it and help loosen - and later sever - the bonds of motherhood, namely cut the invisible umbilical cord that still emotionally connects the child to her.

Yet that is an unwanted stage and anathema to the narcissistic mother. First, she would lose her standing and position as the center of attention from both family and friends. Gradually, her child is also gaining and creating some distance from her. This arouses intense anxiety and frustration in her as her projected role of motherhood as a caregiver is falling to pieces.

Since her love for the child is neither authentic nor genuine (let us keep in mind that narcissists are generally incapable of loving or feeling empathy for others, not even themselves for that matter), and since her love and care are merely an expression of her wish to control and have power over the helpless child, she will try her utmost to stifle the growth and independence of her offspring.

In fact, what happens afterwards with narcissistic mothers is a case of neurotic “unselfishness.” This supposed unselfishness is, as Fromm points out, not in the sense of love and caring for others but more a manifestation of the hidden symptoms of depression, tiredness, and failure in the mother’s own love relationships. The so-called unselfish mother will claim to not want anything for herself and will make others (and sometimes herself) believe that she is only living for others, that is, for her child and children only.

Such unselfishness, were it meant as a true manifestation of unconditional love, should create happiness within the given individual, but the fact remains that the narcissistic mother does not feel happy at all; quite to the contrary, she feels unhappy, sad, angry and resentful with life in general and her lot in particular. These people are indeed paralyzed in their capacity to love or enjoy anything, themselves, their family or their children.

What lurks behind this façade, appearance and demeanor of unselfishness is, in fact, an intense self-centeredness. Narcissists see themselves as, and continuously crave being, the center of attention, and this is exemplified in their supposed sacrifice (of time, money, resources, and energy) for their children; they relish posing as victims and even complain about being victimized by their constant and never-ending state of motherhood. We can see how and why the physical and, worse, emotional independence and separation demonstrated by her children can cause distress and displeasure in such mothers.

Moreover, children who are supposed to benefit from this supposed sacrifice of their mother are, in fact, not happy but rather traumatized by this situation as they grow up in a toxic environment to begin with. These children tend to be anxious, tense and afraid of the disapproval of their mother and try hard to live up to her expectations; yet to no avail as she will never be satisfied with others or herself. 

Children raised by narcissistic mothers feel stifled in their personality and individual expression, while, in many cases, they do not manage to shake off the bonds and tight grip of the domineering and possessive mother. Even in adulthood, they may not only hold onto the need for having their mother’s emotional support and guidance, but they often project those same qualities upon their own partners and spouses.

In fact, a selfish mother would, in contrast, be much better for one’s psychological health and well-being because the narcissistic mother’s unselfishness works like a protective halo around her. While you can criticize the selfish mother for being careless and for not catering to the needs of the child, the same is much more difficult to say or do when it comes to the “unselfish” mother; the child feels both conscious and subconscious guilt towards her and is reluctant to utter any kind of criticism whatsoever. Since the narcissistic mother does not love herself, she is equally incapable of giving love to her child, and this trauma reaches out and continues far into the adult life of that person.

A narcissistic mother is trapped in a time bubble of when she felt most needed and wanted and when her children were merely the objects of her power and control. She will do her best to stop their emotional and mental growth, and the fact that she does not want what is best for her child but rather what is most convenient for herself shows not only deep-seated self-centeredness but, worse, a hatred of and contempt for life in general. Not only does she live her own life without honey, but she also has none to give to her offspring, so they must look for a replacement via other means.

Saturday, August 18, 2018

Purely Psychosomatic: It’s All in Your Head Book Review



White Book Cover with Cracked Egg
The book It’s All in Your Head: Stories from the Frontline of Psychosomatic Illness by Irish neurologist Suzanne O’Sullivan was first brought to my attention by my attentive wife about a year ago. She had found out about this book online and told me about it as she, being in the medical sciences herself, knew me to be a (borderline) hypochondriac. Previously, she would almost always say to me that my problems and ailments were psychological in nature, and I would retort, somewhat angrily, that this was simply not true.

Of course, she was right. But when I stumbled upon this book in our local library, I could not resist giving it a read. Browsing through the pages, I felt a bit discouraged because it was divided into chapters by patients’ first names; my first assumption was that it would be merely a collection of case histories and medical accounts, that it would be somewhat dry and boring and, what’s worse, perhaps not very useful to me personally.

And for the second time, I was proven wrong. It is refreshing to have a book like this written by someone who has had extensive and significant work experience with psychosomatic patients and who was firmly grounded in the medical sciences as a practicing neurologist. Apart from interesting and relevant background information about the history of psychosomatic illness via renowned neurologists like Charcot, Janet and, of course, Freud, she also provides up-to-date neurological research on the topic.

Psychosomatic illnesses are purely psychological in origin. These are cases where there is ailment and suffering in the patients, but no medical cause can be found or established. Tests and scans turn out negative, meaning that the problem has its origin in the mind. In the past, cases like these were relegated to the condition of hysteria. In its more extreme cases, this would be paralysis of the limbs, although physically there was no discernible damage to muscle tissue; the body was healthy and should have been functioning properly and well.

The ancient Greeks coined the term hysteria mostly due to the fact that many women were afflicted with the condition; hence, they (erroneously) believed that it was the uterus traveling to different parts of the body that caused ailments and diseases in those specific parts. Yet at the same time, Greek physicians like Hippocrates considered hysteria to be an organic disease, namely a disorder of the body and not of the mind. Thereafter, in the Middle Ages, hysteria was equated with witchcraft; it was believed that the condition implied that women were possessed by the devil.

It was not until towards the end of the 19th century that the medical profession took up the study of hysteria again, when Freud and other physicians showed how these patients could be led to lose the function of their body parts merely through the power of suggestion, that is, through hypnosis. These patients (most of them female, but there were males afflicted with the same condition as well) had come to truly believe that they were paralyzed. It was, as Freud demonstrated, due to the unconscious part of our personality, which has a certain amount of control over the body without our knowledge or awareness.

It is important to note that people who suffer from psychosomatic disorders are not imagining or inventing illness; they are indeed suffering and need help. They are also more common than you may think: O’Sullivan estimates that anywhere between a third and half of all patients who come for consultations at medical clinics have underlying psychosomatic issues at heart.

That means that up to half of the patients a family doctor sees on a daily basis do not have physical problems and hence cannot be cured via physical means, such as medication or surgery. Unfortunately, doctors either do not believe in psychosomatic illness, due to their own prejudices and / or medical training, or - and this may be the more likely cause - they feel that patients would not be satisfied and might even reproach them for such a diagnosis.

The reason that medical treatments end up working for certain patients suffering from psychosomatic illness is merely due to the placebo effect. The patients think they are taking medication that would help them, and this might calm their fears and anxieties to a certain extent and degree. But it does not treat the underlying psychological cause for which a psychologist or psychiatrist ought to be consulted.

But most of us are reluctant to accept that particular diagnosis. There is a stigma attached to mental health that is harming us and preventing many of us from seeking treatment when it would be essential and vital for our mental health and well-being. While we would not look down on people with so-called “real” diseases like cancer or Alzheimer’s or multiple sclerosis, we tend to see mental health issues as something minor and less “real” and, oddly enough, as one’s own fault.

I have often heard many a misguided suggestion given to insomniacs and people suffering from depression. No, you cannot merely "snap" out of it: you cannot just close your eyes and fall asleep, the same way you cannot just smile and be happy. This kind of advice makes the afflicted person feel even worse, and in addition to their suffering, they would feel guilty about it. They would blame themselves and consider themselves responsible for not being able to “snap” out of it, and they would also be less likely to seek the help and treatment they need.

In fact, that people with psychosomatic illness do not invent or make up their ailments has been scientifically demonstrated. In an experiment, people were told to pretend to have paralysis, that is, to try hard not to move a hand, for example. As they did so, electrodes attached to their scalps recorded their brain activity. Those who merely pretended to have paralysis had a different part of their brain light up compared to those who had psychosomatic illnesses.

It shows us that what is happening to these patients is not within their will and control, but it is controlled by the unconscious parts of their brain. How can psychology affect the body in such drastic ways? How can we possibly be led to believe that we have seizures or paralysis when there are no physical causes for them?

And yet, the evidence is visible to all of us. Imagine you have a public talk to give in front of hundreds of people. How do you feel? You are most likely sweating, your hands may be cold, clammy, or trembling; you may breathe rapidly; your blood pressure as well as blood sugar could go up; you might feel light-headed, and the list of symptoms goes on.

Oddly enough, all this happens despite there being no physical or tangible threat in front of you; this is merely nervousness caused by a possible sense of embarrassment, all of which is played out in one’s imagination. If a relatively minor stressful situation like this can cause such physical symptoms in a person, and if it is quite difficult to rein in or control those physical responses, how much more could be going on when there is significant psychological trauma within that person? It is certainly not something that can be solved at the wave of a hand or the snap of one’s fingers.

Psychosomatic illness is real, and it shows us the power the mind has over our body. It also underscores the importance of one’s mental health and well-being. Most of us do not even realize - myself included - that our symptoms are psychological in origin, and we continue having unnecessary and ineffective treatments that are meant and deemed for biological and physical conditions.

The book gives many examples of seizures, for instance, which are not always due to epilepsy. Through brain scans and video monitoring during seizure episodes, neurologists can often discard that option and determine that the ailment does not have a physical cause. However, many such patients take (and often even prefer taking) medication that comes with potentially serious side effects and complications instead of getting to the root of the underlying problem or health issue.

This book is enlightening, especially since we are living in a time when mental health is not given its due place. But it should be. The fears, anxieties and trauma associated with living in the modern world can take their toll on even the strongest of us. We need to be open and honest about such issues and not hide behind a fake and pretend mask or a face of toughness.

We should also treat other people that are suffering from mental health issues not with disrespect or prejudice but with care and concern. And when necessary, we need to push and guide them (and ourselves) to see past inherent prejudices and seek the treatment that is needed.

Tuesday, July 31, 2018

The Immaculate Celebrity State of Being



Photo of famous actor and celebrity
I recently watched Phantom Thread (2017) by Paul Thomas Anderson, one of those few filmmakers out there who are creative, idiosyncratic and generally unpredictable. He first impressed me with the magnificent and opera-like Magnolia (1999), only to follow it up with the small indie-like film Punch Drunk Love (2002), which ran at about half its predecessor’s running time and starred, of all people, Adam Sandler in one of his best and funniest roles. Phantom Thread is a different breed altogether, but I quite enjoyed this film in its own right.

Then I heard about what inspired this talented director to make the film: It involved a strong bout of the ‘flu during which he was taken care of by his wife. There might be nothing too exceptional or extraordinary about this situation, although ardent feminists might possibly criticize his wife for falling prey to the archetypical gender role of caretaker; yet for me something felt amiss and I felt queasy about this. And somehow it inadvertently lowered my esteem for this great filmmaker.

This might be partly due to his candid admission of weakness or helplessness. Although generally I do not see admission of weakness as disempowering - in fact, I think it’s quite the opposite, only the strong and confident can fully admit to and embrace their flaws and weaknesses – in this particular case, I would have preferred an instant of sudden epiphany as a spark of inspiration or something else that did not involve common day ailments. There was far less magic in the latter.

It was mainly the disappointment of finding out that he is, after all, a human being like everyone else, prone to sickness, disease, pain and suffering, and not an otherworldly genius untouched by such ordinary issues. As someone who recently had his own unpleasant brush with a stomach ‘flu (and no, I don’t think it was the mushroom pizza my wife made me), I can personally vouch for and sympathize with the grueling experience of sickness; but the problem is that we often assume, mostly subconsciously, that these god-like celebrities are not only beyond the common rabble in terms of talent and intelligence, but that they are also immune to human foibles. Of course, they are not.

There is and most likely always has been a cult around celebrities. These famous people often reach a god-like status in our minds. In the past, it was mostly related to artists, musicians or other brilliant minds, so people might have swooned over Mozart, Beethoven or the booming voice of Charles Dickens and perhaps less over great philosophers like Nietzsche whose talents were barely recognized in his own time.

In our time, with the advent of television and other forms of technology, celebrity status has become more ubiquitous. We see celebrities’ faces plastered everywhere, from checkout aisles to posters to billboards and other types of advertisement. One could easily point to the crushing waves of Beatlemania, with young people fainting in front of their musical idols, who sported terrible haircuts that incidentally did not hinder but rather helped propel them to stardom. Yet this kind of swooning is neither peculiar nor limited to the music industry or the young.

This otherworldly aspect is evidently augmented through the use of television and the big screen as well as today’s smaller blue screens of laptops and cellphones. The faces of actors and actresses are now streaming on our gadgets and devices and we watch them play their respective roles, while they have reached quasi-mythical statuses in our imaginative minds. We infuse them with borderline supernatural powers and seem surprised, not to say shocked, to see that they are like us, prone to slips and weaknesses.

For instance, we are surprised to hear that some of them get angry, lash out at others or get into brawls, and that they cheat on their loved ones and / or lie to the public. They are often held to higher moral standards than politicians, who in some cases are celebrities themselves entering the realm of politics. 

All of this makes us lose sight not only of humanity but of reality as well. We assume that the Terminator can indeed end all the threats or that a cowboy actor president may shoot and eliminate the villains once and for all. Whenever flaws or weaknesses creep up, this wildly unrealistic and implausible image rips and cracks at its seams, and we either turn a blind eye or get angry and drop those celebrities like hot potatoes.

One of the problems with being in the limelight (although my personal experience of this is very limited) is that you become visible to all; hence the preoccupation with one’s looks. As the faces on the screen remain fixed and immortal, the changing face of aging often becomes a major source of stress and anxiety for celebrities. More than any other group of people, they will try to preserve and hold onto the faces we have come to know on the screens, and more often than not, plastic surgery is used to rectify the (supposed) blemishes of getting older.

It is somewhat easier for men as our culture tends to find them still sexy despite or in some cases because of their advanced age (I’m looking at you George Clooney pictured above), while women constantly battle the lines and wrinkles on their faces and dye their hair (except Helen Mirren, of course, and that is a good thing).

So much for physical aspects; let us look at the shadier moral parts and pieces. Here it does indeed become messy. As we expect them to be perfect not only in looks but also in demeanor and behavior, celebrities are held up to often unrealistically high standards. This is where the Me Too movement found its main drive and inspiration. The now unaccepted and always unacceptable behavior of male celebrities towards their female counterparts and females in general has brought everything quickly and successively into the spotlight.

And so it should be. However, part of this fallout is fueled by an anger towards celebrity figures because they failed to live up to our higher standards, regardless, in some cases, of the degree and intensity of their actual deeds. The main culprit here is of course Harvey Weinstein, who exploited women by using and wielding his power and authority in the movie industry. And many other celebrities fell in his wake.

Yet the problem lies in the fact that everyone is suddenly judged retroactively for their behavior. In the past (and in some cultures still today), many deeds and actions were accepted or simply ignored, such as whistling at women or making inappropriate comments towards them. I do not condone that, but evidently, these deeds are not as serious and damaging as rape and sexual assault, which are crimes no matter where you are. However, during the zealous swirl (I am careful not to use the word hysteria) of the Me Too movement, everybody was thrown into the same pot; whether they had engaged in misdemeanors or crimes did not matter, or the general public did not differentiate much.

Case in point is Morgan Freeman. He is or rather used to be well-loved and respected. There was hardly a blemish on his track record until news surfaced of sexist language he used towards females alongside inappropriate behavior, such as attempts at lifting skirts. Suddenly everybody started attacking him and his reputation took a great hit for things he had said and done in the past. 

Sure, what he did was wrong, and he did apologize, but the harm was done. But he is not a Harvey Weinstein, Bill Cosby or a Kevin Spacey for that matter. He is like many others and he engaged in what was or used to be generally accepted in this macho culture. The sitting president was excused for similar behavior referring to it all as typical locker-room banter, while Morgan Freeman was evidently not.

Another thing we tend to overlook is that celebrities are merely doing their job, and those who are good, good-looking and / or simply lucky can also gain handsomely in the process. But it is, at the end of the day, a job and not much more. So you cannot expect an actor playing a doctor on television to operate on people in real life, nor hire Perry Mason to take your criminal case or Inspector Columbo to solve a crime (replace them with younger and hipper models and examples, if you wish). Nor can we assume (far from it!) that Bill Cosby is a good father or human being despite his projected image of a loving and caring person. These are roles that often do not correspond with reality.

We still buy it though. We even expect a golf player to act with moral standards even though they are just athletes who are good at their given sports. That is all. They are no role models, nor do they have the responsibility of being one. Rock musicians have it easier because we expect them to behave badly and when they do not, we get disappointed. Yes, Tiger Woods would have made a great rock star, but he (unfortunately?) chose golf instead and was then blamed for his actions.

There is a distinct form of hypocrisy at play here. We expect our celebrities to be who we think them to be. We mold them with the aid of media and publicity into archetypal figures or figments of our imagination. As long as they play the role, we are content with them. The moment they assert their individuality and are not who we thought they were, we feel angry and reject them. In these cases, it may be about them, but it also says a lot about us and our attitudes towards ourselves and others.

Saturday, July 14, 2018

The Quantum Metaphor for Life and Sciences


Two ways of seeing reality in a restaurant window
        Pasta or Pizza?
We can probably all relate to the following experience: There are five minutes left in a sports game (soccer, hockey, what-have-you) and your favorite team is ahead by a goal. 

You are anxiously looking at the clock, hoping that your team is going to pull through with a win. A lot can happen in that five-minute interval, so you hold your breath. The seconds winding down feel like an eternity, and you wish you could move the clock’s hands more quickly to end the game and secure the much-desired win!

Now let’s switch and flip around the whole experience for argument’s sake. There are still five minutes left in the game, yet in this scenario your favorite team is behind by a goal.

Anything can happen in that time interval as well, but the problem is that the five minutes that seemed an eternity in the first case are now flowing and flying by much too fast. You do not want to speed up time but would like to grab and tie its hands and stop it from moving further, so that your team is given enough time to score that essential and vital equalizing goal!

The constant in both situations is the time interval. In each case, we are allotted the same amount of time. Although time is relative, as suggested and proven by Einstein, it is for all practical purposes constant and the same (at least on planet Earth) whether you are cheering for Team A or Team B. The only difference lies in our perception of time.

This, of course, is not merely limited to sports events. As a rule, any event that thrills us or brings us joy will make time fly and go too fast for our intents and purposes, whereas dreaded events seem to move at a painstakingly slow pace. The boring class that seemingly will never end; the work shift that is taking an eternity to wrap up and finish. In either case, objectively we are faced with the same amount and length of time, but subjectively, we experience time quite differently.

Yet our scientific view of things demands that we be objective in our observations. We say that regardless of the personal experience of time, the data that can be measured is exactly identical in each scenario. That is a fact.

In the same vein, science needs quantifiable information: today’s temperature is 25 degrees Celsius (its equivalent of 77 degrees Fahrenheit). That may feel warm to you if you live in a cooler climate, or cool if you are accustomed to living in warmer and more tropical regions. Yet the exact measurable degree gives us and sets a benchmark to gauge the level of heat at that moment in time.
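As an aside, the two figures do line up: the standard conversion is F = (9/5) × C + 32, and (9/5) × 25 + 32 = 45 + 32 = 77.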

Or does it? This may take us to the medical sciences. There, we have a disease that can be objectively diagnosed through specific tests, be it a blood test, a urine sample or an X-ray. Based on the evidence, a person either has a disease or not. A doctor, unlike an economist or even a weather forecaster, is not there to speculate or to give us odds and probabilities as to whether a patient has a disease. We need scientific data or proof to corroborate the diagnosis.

The problem with this is that a given disease may be the same, but the personal experience of the disease is going to be quite different. Put differently, if a hundred persons have the exact same disease, its impact - that is the amount and strength of suffering, affliction, pain threshold etc. - is going to vary - at times rather substantially - from person to person and case to case. This experience, namely how ill the disease makes a person feel, is referred to as illness.

There are people who have a certain disease, but are not aware of it as they do not feel unwell, while others react to it rather strongly. This may depend on many factors, including the genetic, physical, and psychological make-up, the person’s life experiences as well as their ethnic and cultural background. No two people are ever alike, and their response to medication and treatment will also vary, which is why even medical sciences cannot always give us the clear quantifiable data we would like to obtain.

To complicate matters, there are many cases that are deemed functional neurological disorders or are diagnosed as conversion disorders; these are rather psychosomatic ailments that do not correspond to, nor can be traced to, an organic cause.

People may suffer from pain or even paralysis in parts of their body without a physical cause; rather, their illness stems from often subconscious psychological issues or trauma. The book It’s All in Your Head by neurologist Suzanne O’Sullivan, which has also graciously and inadvertently provided some of the background medical information for my post here, gives insightful and detailed explanations of such cases.

But for our intents and purposes, we want to suggest and highlight that certain scientific data should be taken with a grain of salt. I am not saying that we should consider the clearly ludicrous notion that the earth is flat (it is not). But in the past, learned people claimed with stern conviction that this was so, and they were later proven wrong by science. This shows us that supposed certainty does not necessarily mean that one’s view is or will continue to be correct.

There is no better way to prove this than with quantum mechanics. Suddenly, we are faced with dilemmas in which our regular understanding of the world is shredded and falls to pieces. Is Schrödinger’s cat dead? Yes. But can it be alive? Yes. Is it possible for it to be both alive and dead at the same time? Um, yes, it remains in a superposition of both states - a kind of zombie state, if you will - until the box is opened, which is the only time we would know for sure. Are you sure about that? Absolutely.

This is a realm where objectivity does not give us the distance that we need to define and verify events. Light can be both a wave and a particle depending on how you look at it; it is not an issue of P or not-P, as it can be both at the same time! In this case, the subject becomes so involved and enmeshed with the object that one cannot simply be without the other! Put differently, they are as interconnected and intricately linked with each other as space is with time in the indivisible form of space-time, which, after all, happens to be not flat but curved.

In these instances, our logic seems to go out the window, and we may come to the uncomfortable realization that time and everything else for that matter is nothing but an illusion. The objects and colors we perceive then are nothing but atoms that move sometimes more or sometimes less quickly. The absolute kind of truth that we expect of Newtonian physics as well as the razor-sharp stiletto of logic will have to take a backseat for a moment due to the discoveries of the uncertainty principle since electrons and atoms disregard those rules and laws.

But there is a way out of this entangled mess. As humans we have always been prone to adapt to our surroundings and as humble and open-minded scientists we are generally quick to assimilate and respond to constantly changing circumstances. This does not mean that our previous scientific knowledge and discoveries are wrong (they are not) but there is still a factor we have been queasy about and that is the element of subjectivity.

Any human being no matter how well-trained and accomplished cannot escape their own subjective viewpoints and biases. And let us not treat it as a negative thing but actually embrace it. Let us rethink science and not see it as distancing the object from the subject but combine both in a mystical dance, where I lose myself in the flower I am contemplating and examining, and I am the flower and the flower is me.

Let us use our subjective capacity and empathy to identify ourselves with the object in question instead of carefully extricating and distancing ourselves from it. Let us consider - as it has been occurring in psychology – the person that comes to consult the therapist less as a patient but more as a client or agent who can benefit from the doctor’s knowledge, the same way the doctor can benefit from this interdependent interaction.

This is what could be called the quantum metaphor. One can apply this mystical uncertain certainty as a union between object and subject, interior and exterior, self and not-self, to create a new perspective or paradigm of the world around us.

It can be applied to anything from science, philosophy, and politics to religion and daily life. When there is no definite yes-or-no answer or truth, one can see the world with different eyes. There is no good or evil per se but only ever-changing circumstances. An immoral act of stealing or lying may be justifiable and even commendable in certain situations.

Let us listen to the other, our supposed enemy or threat and see them not in the biased and one-sided Us vs Them mentality, but let us notice the common ground that we share despite our perceived differences. Yes, we can have a love-hate relationship with someone and that is not necessarily a contradiction in and of itself.

I am not merely saying that one should inundate oneself with positive thinking. This is not merely a glass half-full, half-empty metaphor. In fact, positive thinking can do us more harm than good in some cases. Nor am I talking about pure rationalism that would justify philosophical trends like utilitarianism where the benefit of the majority supposedly can override the suffering of the few.

The quantum metaphor would simply allow us to think of the world in a less divisive way; it is not just about me versus them or my self versus the external world, but rather a unity where both joyously complement each other, where harm to my neighbor will, in return, harm me as well. The quantum metaphor would also help us curb our hubris and overreaching ambition, allowing ourselves not only to be wrong on certain matters, but even to contradict ourselves and our stern principles when the situation requires it.

To exemplify this in another way, let us look at language and experience. For example, anxiety is something we try to avoid as we see it as a negative emotion and experience. But we would be wrong to do so. 

Anxiety, not unlike pain, gives us signals that something is up and that this something needs our attention. It points us toward a problem or issue that exists within us. Instead of avoiding it, we should embrace it, follow it, and see where it leads us, the same way we do not ignore pain when it alerts us to a health issue in our body.

Equally, the adjective anxious can be perceived in two contradictory ways. I can feel anxious in its negative, nervous sense, or I can be anxious for something to happen, as an expected thrill or a sudden rush of emotions. Or I can simply be anxious for my team to win with only five minutes left in the game.

Tuesday, June 5, 2018

Education à la Carte: The Shift from Traditional to Student-Centered Teaching



Old-fashioned classroom with wooden table and chairs and blackboard
The classroom, not to mention curriculum and teaching methodology, has changed significantly over the years. Class sizes have grown exponentially in most academic institutions, with the exception of colleges that tend to cap the number of students admitted per classroom. Universities have for the most part gotten rid of those uncomfortable and inflexible wooden student writing chairs I used to find myself trapped in for the duration of the class; at the same time, traditional writing utensils, such as old-fashioned pen and paper, have largely been replaced with styluses, laptops and/or iPads. Cellphones are in practically every hand, be they those of students or teaching faculty, and these gadgets have become an indispensable part of daily life, whether we approve of them or not.

The times they are a-changin’, and technology has affected us in often imperceptible and implicit ways. Yet it has had perhaps one of its most drastic effects on teaching itself. Schools and universities are naturally conservative in nature and tend to view trends with a suspicious eye; whenever they feel threatened or invaded by something, they simply ban it. They thrive on fostering discipline and use the cloak of authority to enforce it most effectively. In the past, physical punishment was the norm for misbehaving students, who were often threatened with having their parents notified or with possible expulsion from school or university.

Students used to have little to no say when it came to their education. They were not asked or consulted, and, in many cases, they were not even taken into account when it came to educational matters and methods. Instead, the school board and university management emphasized and aggrandized the role of their teachers. The latter were to be treated with utter respect, and it went without saying that everything from curriculum to methodology was, ipso facto, teacher-oriented.

Instructors would proverbially preach from the podium, and with hanging heads the students would jot down every pronouncement coming from the teacher’s lips as if they were the words of God. Rote exercises and memory drills would ensure that knowledge of dates and details was imprinted indelibly onto the fresh and impressionable minds of the young. The exams would test rather mindlessly - that is, with little creativity and even less critical thinking - what the students had retained from the previous lectures. Those with good or photographic memories would for the most part excel; they could impress with citations of facts and recite poetry without understanding or appreciating any of it.

The adage that knowledge is power was deeply ingrained into their minds, so students would vie and compete with each other to see who could cram the most information into their spinning and overstuffed heads. Back then, there were not many ways of verifying information except through a quick visit to the local library. But today, with knowledge at the fingertips of almost every individual on the planet, facts can be quickly checked and verified or disputed; consequently, teachers have lost some of their hegemony and authority in claiming to be always in the right. In other words, it has become easier to prove one’s professors wrong, and the latter can no longer simply rest on their laurels or nest on their diplomas but must ensure that their facts are indeed correct.

These technological changes, alongside changing political and economic circumstances, have shifted the focus and practice of education. While the student used to be perceived as an empty vessel that needed to be filled with knowledge, we now view them as autonomous human beings who, more importantly, have their share of rights. Teachers cannot physically punish their students anymore, nor should students be treated disrespectfully; in fact, students are to be seen as active as opposed to passive members in the educational exchange. Moreover, educators do not treat them as blank slates but try to delve into and even utilize their previous knowledge, building upon their existing skills.

Since encyclopedic knowledge has lost its status and value (few today truly and fully believe that knowledge is indeed power), and since information access has become rather commonplace within the new paradigm (essentially anybody can google information), the shift has moved from mere acquisition toward the application of knowledge. This is accomplished via critical thinking: a move away from content-specific information (who wrote what and when) to skills and learning outcomes (what are the implications of the text in today’s world). What the education system is interested in nowadays is less whether students know something than how they can use and apply their knowledge in their own lives. This has led to the shift from a teacher-oriented to a (more) student-oriented approach.

All of this is constantly shifting and changing, and sometimes it goes off course or even overboard. One instance of the latter is the flipped classroom. In this case, it is all about the students; the teachers are stripped of almost all their authority and input and are considered mere facilitators of the learning process. This can be quite challenging, not to say frustrating, for students. It is akin to the (mal)practice of throwing children into the swimming pool and letting them figure out how to swim by themselves.

There are certainly benefits to be gained from eclipsing the over-imposing and often interfering figure of the teacher and giving students opportunities to collaborate and solve problems so that they can think in an active and productive manner. However, the flipped classroom takes this idea to an extreme that is neither the best nor the most efficient approach. One should neither unduly restrict the students nor give them too much leeway, but hold them responsible and accountable while guiding them with appropriate measures. The middle way generally proves best, and there is rarely one single approach that pays the most dividends but rather an amalgamation of different styles.

In the same vein, I do not think that students and teachers should be on equal footing. In many cases, students already have too much control when it comes to power and authority. This is especially so as higher education has become mainly a business, and like any successful enterprise it wishes to, or has been economically forced to, provide its clients and consumers with what they crave most. Certainly, educational standards ought to be kept at a similar level, but these too have become more flexible and may even adjust to the political strategies and structures of the times.

Students have earned their human rights, but they should not dictate the mandates of their own education. I disagree with students making up their own exams, deciding on the content of the course, or generally doing as they please; students already have shorter attention spans and problems with concentration and following instructions, so they should not be left to their own devices. Part of the reason the traditional lecture model no longer functions in today’s world is that students’ capacity to focus has diminished due to the use of various forms of technology, and there is little that will change that.

For this reason, educators need to adopt strategies best suited to the desired outcome. In other words, I would not wholly eliminate the lecture as a teaching tool but would minimize it and follow it with active involvement and participation. It should not be all about the teacher, nor all about the student; there should be a healthy balance, a middle-way compromise between the two. What worries me is the danger of eroding respect for authority, in this case the teacher or instructor. Their relationship should remain on the formal and professional side - not rigid, but with a firm grounding or foothold.

In fact, education should not be a restaurant where students can order whatever they please. It should have an established menu yet be open-minded and flexible enough to switch its ingredients or change the way the meals are served. One could adjust one’s teaching methods under what I like to call framed spontaneity, making room for improvisations and on-the-fly changes to one’s course content as well as its delivery. Furthermore, one should not shun or fear technology by prohibiting cell-phones, laptops or other electronic devices in the classroom, but rather take advantage of them for the benefit of everyone involved.

A blended class, where classroom interaction is enriched rather than wholly substituted by technological tools, would be the ideal outcome. In many such cases, students know more about technology and its latest advances than the teacher, so this knowledge can come in handy for the educator as well. Used efficiently, this situation could serve as a veritable educational encounter and a profitable interaction between students and educators, ensuring that learning continues to take place on both sides.