Sunday, 16 June 2019

Hyper-Vigilance - Time To Burst The 'Safety' Bubble

A recent study suggests pregnant women become “hyper-vigilant” towards the end of their pregnancy in order to keep their unborn baby as safe as can be.

The research from Anglia Ruskin University looked at peripersonal space – the sense of the space immediately surrounding a person – and tested how a woman’s perception of it changes during pregnancy.

Scientists used audio-tactile testing to investigate how the part of the brain that is aware of personal space was affected as the pregnancy developed. They found that, while it was unchanged during the first two trimesters, the boundaries were expanded during the third trimester as the woman’s body stretched to accommodate her growing baby.

It is a fascinating revelation and it’s easy to understand why this would be the case; it makes absolute sense. The woman is on the watch for any danger to her developing baby; all her senses are on high alert; she is careful, wary and on the lookout for any threat. She has reverted to her animal instinct, using the part of her primitive brain – the amygdala – that deals with emotions and is responsible for alerting her to danger. 

So, for an expectant mother, being hyper-vigilant makes sense. There is a very clear point to it. 

But what about hyper-vigilance in other circumstances, when the need may not be there?

Hyper-vigilance was good for our ancestors who had to be ever-aware of their surroundings and what was going on that might be a threat to their survival. The amygdala was crucial to their survival skills. It gave them the instant fight, flight or freeze reaction that could mean the difference between life and death. 

It is useful still, as 96 Harley Psychotherapy’s founder Dr Robin Lawrence explained: “You’re walking down the road reading your smartphone and in a world of your own. You’re not thinking about what’s going on around you and are about to cross the road, when, for some inexplicable reason, you stop. And just as you come to a sudden halt at the edge of the kerb, a big red bus goes by.

“If you’d stepped out, you’d have been badly hurt at the very least. As it is, you’re standing still with shock, your mouth’s dry, your heart’s racing and you’re not sure what’s happened. 

“But you’re all right. You’re alive. And that’s because some deep-down warning system within – your amygdala – was doing its job well and looking after your survival.”

So far so good, but now we come to the more difficult part. 

We no longer operate within the same world as our ancestors and, all being well, our survival skills should have expanded to incorporate the greater reasoning and understanding we need in our modern world.

For this, we need the pre-frontal cortex, the part of our brain that works out the best reasoned and calm way to respond to a present situation.  

The pre-frontal cortex receives its messages through the hippocampus – another key part of the brain connected with our emotions – which stores past memories and information and guides us towards our response.

Interestingly, studies suggest people with anxiety problems – of which hyper-vigilance is one – have a smaller pre-frontal cortex than those living within a “normal” range of anxiety levels. 

The good news is that it develops with use, so the more a person uses their reasoning and “in the present” thought process, the easier that process will become and the less reliance the individual will have on the amygdala and its impulsive response.

Hyper-vigilance is believed to be connected with trauma and post-traumatic stress disorder. It is a reaction to something that was perceived to have been terrifying, out of control and perhaps even threatening to a person’s life. For example, imagine a soldier who’s been under fire in battle and remains forever stuck in that heightened sense of awareness, unable to move from that terrified state into a place of reasonable normality.

The threat may indeed have been real at a particular time for an individual, or it may be a childhood memory – accurate or not – of a time when one’s very survival literally did depend on another person.

So, if a person is hyper-vigilant as an almost “default” position, their response to life may in fact be detrimental to them – the opposite of what they are trying to achieve. If they are permanently on the lookout for trouble, they will be forever reacting emotionally, impulsively or inappropriately, because they have not worked out the right way to respond to the situation happening in the present, at this moment.

Hyper-vigilance carried into adult life is not a good idea. It has the potential to revert us to a child-like state in which we are full of emotion but have lost the reasoning skills that help us develop into fully fledged, thoughtful and capable adults.

So, while we admire nature’s ability to give an expectant mother the temporary gift of hyper-vigilance to protect her baby, we need to remember that, on a permanent basis, it is no way to live.

By Lulu Sinclair


Image of pregnant woman by Mystic Art Design from Pixabay
Image of a highly alert meerkat by Manfred Richter from Pixabay  

Thursday, 30 May 2019

In Defence of the Millennials

I’ve been reading a number of articles over the past few months about the “snowflake” generation, as the millennials have come to be known.

The name “millennials” was cleverly created by advertising folk to target people aged between 22 and 37 and, from that comparatively loose term, come Generation Snowflake and the later-born Generation Z.


From what I read - rows over university debates, "safe" spaces, climate change, veganism, you name it, they had a strong view on it - the "snowflakes" appeared to be young people who were very careful to want recognition of their own feelings and needs but would brook no argument about anything or from anyone who disagreed with them.

Put like that, what’s not to find irritating?  

Most of the articles* I’d seen were critical of the group as a whole and that for a start seemed odd. How can you generalise about a generation of young men and women and treat them as if they were all the same? Isn’t that disrespectful? Maybe it was time to look at issues from their point of view and to consider the views of the “other” and reflect on them in a different, non-judgmental way. 

So, considering for myself, I’ve been wondering: what’s wrong with being a snowflake? Don’t we love the original snowflake, each one that falls having a unique design that will never be replicated? Isn’t that a wonderful analogy for a person? And if a new generation is aspiring to live a life that allows the “uniqueness of the individual”, why are we not applauding and backing them? Why are some of us sighing, pulling faces and trying to absorb them into our way of life, rather than thinking about their desires instead?

It seems to me that we in Britain have been forced into the “one-size-fits-all” box of industrialisation and we have forgotten what it’s like to be individuals. Our society requires parents to go out to work to pay for huge mortgages, bills, holidays and whatever else is required of us in a capitalist society, and that means many children are being institutionalised at a very early age. There are nurseries taking babies from the age of three months. In some societies, the idea of removing an infant of that age from the care of its mother would be seen as cruel.

I'm beginning to believe it’s hard being a millennial. There’s little job stability - the gig economy causes more stress than you might know - and, if they’ve been to university, a lot of debt to pay back. Rent is hugely expensive and mortgages prohibitive, so how do they get on the first rung of any ladder without outside help? Maybe the “bleating” that we older adults hear has a point. Maybe it would be better for us as a society if we were more emotionally attuned to the individual and his or her needs than we are at present.

And perhaps, if we disagree with some of their views – I’m in favour of arguing a case, rather than banning my opponent for example – we should try to convince them that there is sometimes a case for reason over emotion or a way to incorporate the two. They, in turn, could teach us about being a little more compassionate.

It would be good if cross-generation conversation could be encouraged so that we gain from the wisdom of all; currently, my readings leave me feeling there’s a desire to create hostility between the age ranges, perhaps to stop us blaming our leaders for the mess in which society seems to be right now.

One of the first millennials I came across described himself, as a teenager, as a “post-modernist child”. I asked for the definition and he said: “I’ve seen my parents work very, very hard and not get the rewards they deserved and I’m not going to do the same. That makes me post-modernist.”

The young man rowed back on that a little as he “matured”, but I still admire the original thought. 

I’m going to end this piece with a story about another then 22-year-old millennial working as a journalist who was offered a full-time job – they are very hard to come by – with a top media organisation. He turned it down. I asked him why and he said: “Because I didn’t like the way they treated their older workforce.”

How kind was that? A young man at the start of his career choosing to put his core beliefs ahead of his own personal ambition. 

Put like that, what’s not to like?

By Lulu Sinclair

* A point to be aware of is that newspaper feature articles are often put out there to be controversial and to bring a response from their readership, which will vary depending on their audience. So, for example, a left-leaning newspaper may be more sympathetic to the “snowflake” point of view, whereas an editor with a more right-wing stance may feel his readership would want him to take a more robust approach.

Photo by Rob Sarmiento on Unsplash

Thursday, 16 May 2019

Busting The Drug Myth


With the appalling increase in incidents of knife crime in the UK in recent years, attention has again been turned on to the efficacy of drug laws, and the need to enforce a more punitive system for the possession and use of illegal drugs.

There has also been more focus on the incidence of cannabis-induced psychosis and on fatalities from the use of the “party drug” ketamine.

While the “Just Say No” campaign is now recognised to have failed dismally (whether applied to drugs or to sexual activity), it has left a residual climate of fear. That fear has prompted the public and politicians to campaign to remove the threat posed by the consumption of drugs through prohibition, rather than by focusing on eradicating the extremely lucrative circulation of drugs by dealers, whose trade depends largely on drugs being illegal.

Recent research by Professor David Nutt revealed that the categorisation of drugs (Classes A-C) did not accurately correlate with the actual harm these drugs caused and that, in fact, by far the most dangerous drug in circulation - both to consumers themselves and to others - is the legal drug alcohol.

The stigma attached to illegal drugs is therefore disproportionate to the real damage caused by them and has created a fear-based response that is not only counterproductive in reducing consumption (prohibition was shown to increase alcohol dependency when applied in the US) but risks actually increasing violence as it actively plays into the hands of the drug cartels.  

The majority of the violence caused by drugs is not in their consumption but in the safeguarding of the “patches” where they are sold.

It is here that gangs are formed to protect lucrative deals and expanding these patches now involves the coercion of young children through the notorious “county lines”.   

It also leads to the contamination of drugs because they are cut with cheaper or more potent substances to make them go further and therefore increase profit margins.  

When America banned alcohol, the result was that more people became addicted to whisky than to other drinks. The reason was that whisky is stronger in alcohol content than, say, beer, so it was easier for dealers to transport: smaller quantities were needed to create the same effect.
  
I am not in any way advocating the use of hard drugs or, indeed, the uninformed consumption of any drug (including prescription drugs which, it is now suggested, should carry a health warning) but suggesting funds should instead be directed towards education and the management of access to drugs, which can control both the quality and the quantity of drugs available, and offer appropriate support to those who develop a dependency.

This approach is already showing positive results at the Jobcentre Plus in Peckham, south London, where young people involved in the drugs trade are being coached and helped with housing and education in order to find work and become self-supporting.

Just as not everyone who has a drink becomes an alcoholic, many drug users do not become addicted and will choose to give up of their own accord when their circumstances change.

It is now widely accepted that dependency on drugs, as on any other addictive substance or behaviour, is more to do with the vulnerability of the person concerned than with the substance or behaviour itself. Therefore, if we can work therapeutically with those who are susceptible, there is a greater chance that consumption will be reduced.

It has been observed that in countries where drugs have been legalised there has been a significant drop in violent crime, and many more people - as in Peckham - have been willing to work towards making better choices for themselves.  

It is true to say that, since the beginning of time, man has at times felt the need to alter his mood, and the more deprived or disconnected the person, the more vulnerable they will be to these temptations.

It is therefore likely that the poorer and more disaffected members of society (and this will, inevitably, include ethnic minorities) will be more likely to accept a “bribe” of cash or the promise of designer trainers to escape the deprivation and hopelessness of their environment - and this results in further alienation and rejection.

By this process, we are creating a sub-sector of society and then ostracising that sector for not fitting in.  

However, if we can accept that these people are desperate and deprived - rather than inherently violent and hostile - and work with them through support and education, maybe even with legally controlled provision of drugs where needed, we will begin to eradicate the worldwide violence that is endemic within the drug trade and, perhaps, start to make our streets safer places again.       



Thursday, 2 May 2019

When Living The High Life Gets You Down

The conviction of Anna Sorokin, the woman who conned New York’s high society into believing she was a German heiress, is a great story. If you’re a journalist, that is, rather than a psychotherapist or counsellor. In which case, it’s a very, very sad story.

Anna Sorokin – who went by the name of Anna Delvey – now faces up to 15 years in prison. She’s in her mid-20s and, if sentenced to the maximum term, the former lorry driver’s daughter will be saying goodbye to the remainder of her youth and, arguably, the best time of her life.

During the course of her New York trial, we learned that Anna invented herself through social media and amassed tens – if not hundreds – of thousands of dollars’ worth of debt in pursuit of her life of luxury. She was seen at the best parties; visiting the best places and having the best of times. And we know she was doing this because it was all fully documented via social media.

Except, while the events she attended were real, the person wasn’t. She was made up. And the money she spent was not her money; it belonged to others. 

So how did this happen? What sort of person is prepared to risk their liberty by living a lie “in plain sight”? What may be going on inside someone who feels the need to go to such lengths to be “out there”, to be visible?

Anna, who was born in Russia and went to live in Germany from the age of 16, seems to have had a lot of nerve. She arrived in New York determined to live the dream and, as her defence lawyer explained: “Anna had to fake it until she could make it.”

I understand lawyers have to work with what they’ve got but I’m astonished at that defence and, in a way, that it’s no longer so astonishing. It’s almost looked on as an admirable quality, rather than one of which to be a little ashamed. 

Was Anna not simply doing what everyone else would have done if they could? Was she narcissistic with a sense of entitlement or was she delusional? Or was it none of the above, just a determination to be at the top of the capitalist tree, where so many other people seemed to be having a good time? Could it be that our western society is collectively looking for a quick fix?

In the comparatively few years since it started up, social media seems to have morphed from being a great way of connecting with people to something rather more insidious. People’s posts on Facebook are not inclined to suggest life isn’t that great - quite the opposite - and it’s almost as though someone’s living a parallel life: one exciting and vibrant on-screen while, off-air, life’s so much more mundane.

And, if observing your Facebook friends having so much more fun than you is not enough to make you feel a bit low, log on to Instagram where you’ll find even more to envy at just the swipe of the screen. Parties, pools, yachts, jewellery and glamour, glamour, glamour – it’s all there. 

No wonder Anna came up with her plan. And she’s not the only one. If you haven’t seen the Netflix documentary on the Fyre Festival that never was, I urge you to watch it. It’s a tale of how hundreds of young music-loving people were fooled into handing over great sums of cash for the experience of a lifetime that didn’t happen. Even those people who were unknowingly involved in setting up the “experience” were duped into believing it was possible. Unbelievable, and yet so believable at the time it was happening.

What social media – with “old school” media playing catch-up via digital means – has done brilliantly is tap into our natural fear of missing out. Facebook and Instagram (other social platforms are available…) show the “other” doing something far more exciting than we are and tell us that, if we don’t take up the invitation, we’ll be missing out. We experienced this as children and were taught – either through good-enough parents or learning our own lessons – that it was not always so. Nowadays, we are being bombarded with images that tell us it WAS always so and that, if we’re not in, we are so definitely out, out, out.

It’s quite heartening, in a strange way, to discover we are not the “cool” and cynical people we imagine ourselves to be, and that we are capable of slipping back into our childhood state in which we accept unquestioningly.

Encouragingly, this herd mentality aspect of social media is now being questioned, with Danish psychologist Svend Brinkmann in his book, The Joy of Missing Out, encouraging people to disengage and live a more "moderate" life. 

Anna’s story is not new. People have conned – and been conned – throughout history. But the scam has usually only become known to a limited number of people. 

However, with the everyday prevalence of social media, what makes this so extraordinary is that you and your activities become known to millions. And, because of the huge influence that social media has on our everyday life (“influencers” is the term given to those people who make a living from their glamorous online postings), maybe it’s time we wised up.

I once saw a keyring that said: “I’m not easy but I can be tricked”. It made me smile. Maybe now's the time for a little less laughter and a bit more healthy scepticism.


By Lulu Sinclair

Photo by Elena de Soto on Unsplash



Sunday, 7 April 2019

Aphantasia - Blind In The Mind Part 2

Since my letter asking for feedback from anyone who had personal or professional experience of aphantasia was published in the March edition of Therapy Today, I have had some very interesting correspondence. 

Some counsellors curious about the aetiology of the condition have wanted to know more about it, while others have written to say they themselves experience it. One correspondent wrote as both counsellor and client, telling me she had been able to work effectively as a counsellor without knowing she had aphantasia, while being aware that her imagery and visualisation were limited. Many of those who contacted me indicated their interest in the subject and have offered their help in exploring this condition further.

It has also become evident from these responses that aphantasia manifests across a wide spectrum, varying from those – such as my client AF – who have no visual memory at all, to people who have only colour memory and others whose dreams are visual but other memories are not.  

One respondent explained how she had gained more depth in her perception (from 2D to 3D) through her own personal therapy which was an exciting development for her and raises questions around which areas of the brain are affected by this condition and the possibility of different connections being made. 

Dr Oliver Sacks, in his book On the Move, described how a neurological limitation in one area of the brain can result in more sophisticated development in another, and it would be interesting to explore this further in connection with aphantasia.

For example, my client, AF, works in computer technology and knows of others with aphantasia who work in the same field. That might indicate a higher facility in mathematical and analytical brain function.  

As far as the therapy with AF is concerned, he feels he is definitely happier. The depression and hopelessness that he presented with are no longer with him, nor is the need to consume alcohol in order to lift the burden of time which he experienced as so intolerable.  

We have been focusing on trying to find a purpose to his life that feels worthwhile to him and he has recently adjusted his schedule in order to give himself time for activities that feel meaningful.  

This feels quite a positive step forward and he is presenting with more energy and enthusiasm, although his activities are concentrated on helping others and trying to improve the state of the world, rather than on pleasures for himself which he struggles to identify or regards as indulgent. Even though he continues to offer great depths of compassion towards others, connections to his own childhood feelings remain difficult for him to access.

During our sessions, AF has been able to recall some childhood memories through narrative, but nothing that he identifies as traumatic (it is, of course, possible that these remain split off) and, as yet, there is no evidence to indicate that his mind-blindness has had any effect on his existentially dark view of the world.

AF's therapy has also been a very interesting process for me because following his journey has led sessions to develop in unexpected directions at times, while some other routes remain blocked; so maybe I have my own experience of a certain type of mind-blindness during this process.

From the responses I have received, and my work with AF, I am left with a sense of the widely diverse ways people experience aphantasia and, indeed, react to it. 

One writer describes it as an advantage in that it allows them a less cluttered and more immediate experience of life that is not distracted by visual imagery.  

To my mind, this further challenges sensory “norms” and shows how careful we need to be about coming to any foregone conclusions about how any client may feel about any given condition.

Integrative Counsellor and Psychotherapist 
BA (Hons), MBACP (Senior Accredited) 


Photo by JR Korpa on Unsplash  

   

Sunday, 24 March 2019

Suicide: The Decriminalisation Of First-Degree Murder Of Self


Suicide was once a criminal offence in the United Kingdom; in England and Wales it was decriminalised only in 1961.

Calling it “committing suicide” sparks an instant alignment with “committing murder”, and it is a sort of murder: murder of the Self. It’s the angriest act against the Self. Attempted suicide is a violent act of self-destruction.

We often associate suicide attempts with desperation, depression, powerlessness and having had enough of this world.  

However, if we look deeper into the wound from where suicidal ideation flows, we will find self-hatred, anger with self and others, and a drive to annihilate one's own existence.  

But what happens when you try, and it goes "wrong"? Wrong in the sense that you have survived it.  

In the 1970s, my psychotically depressed uncle escaped from the mental health hospital where he was supposed to be incarcerated for his own safety. He travelled to the underground station nearest to his family home, took himself onto the platform, and jumped in front of a speeding tube train, planning to end his life.  

Much to his disappointment, he fell between the tracks, and survived. What was the result of his failed attempt? A new message entered the wound from which his suicidal ideation flowed: "You can’t even get suicide right!"

Today, there are websites that give directions on "how to" kill yourself, leaving no room for error: specific methods, instructions and assistance on how desperate people can do a desperate thing in planning their own execution – a sort of suicide euthanasia. The UK legal system considers that these sites may hold criminal liability for complicity in another’s suicide, if it can be proven that the person used the site.

As psychotherapists and counsellors, where does our responsibility lie when a client utters: “I wish I were dead”?

Are we to automatically raise the alarm with the emergency services; do we trigger our contractual agreement on breaking confidentiality; do we contact next of kin? Should we brush it off as the client seeking attention and should we really be in a position to play judge and jury on whether or not they mean it?  

How do we know that it isn’t a manufactured focal point, so that we are both diverted from the real issue? Is this how the client gets some extra care from their therapist? Have we missed the risk factors that are staring at us from the “blank canvas” of the face that is brought into the room?

There is a voice, a voice that criticises, condemns, punishes, and goads.  

When that doesn’t succeed, the voice morphs into an empathetic, understanding and gentle confidante, who understands the pain and the reasons why, and so helps with the plans.  

Should that not work, then there’s always the voice that lists all the affronts that have caused personal offence over a lifetime. Suicide suddenly makes sense again, to the one who is wishing to extinguish their own light.

So, returning to our responsibility in our role as psychotherapists and counsellors, I would suggest it’s up to us to work with each client individually to identify whether theirs is a cry of frustration or anger, a cry for help or, the most painful of all, a real and desperate cry of despair, calling on us to try to help them find a way to give their individual life meaning.

The key, as always, is to listen, and to hear.


Photo by Lorenzo Maimone on Unsplash

Friday, 15 March 2019

Lent – A Fast For The Good

Christians around the world are presently marking the period of Lent, the 40 days that start with Ash Wednesday and lead up to the most important period in the Christian calendar: Easter.

Lent is the time that followers of Jesus Christ remember his life on earth – his time alone in the wilderness, his self-reflection and self-sacrifice, ending with the ultimate sacrifice: his crucifixion.

While in the wilderness, Jesus prayed and fasted, and it is through awareness of this that Christians use Lent to reflect on themselves, their lives and their own way of being – past, present and future.

Hand-in-hand with this often goes the giving up of something a person values – chocolate, alcohol or sweets, for example – or the cutting out of something else that means making a personal sacrifice of their own.

And it is not only in the Christian religion that the idea of giving up meets with approval from teachers and followers. The Jewish faith has Yom Kippur and the Muslim faith Ramadan, with each religion deeming an annual period of fasting necessary for spiritual understanding and growth.

So, can fasting be of benefit to those with no particular religious belief? And can it ever help us with our mental state? 

Fasting nowadays tends to be confined to a regular day or two rather than the 40 days of biblical times, a span that stretches modern-day thinking. We have such an abundance of food choices in the developed world that it is hard to imagine going without food for that long. Nor would it seem like a good idea.

However, we do seem to be connecting more with ideas from ancient times. 

Hippocrates was a great fan. The physician and founder of modern medicine, on whose oath doctors still swear, advocated eating only once a day and using food “as our medicine”. However, he warned: “To eat when you are sick is to feed your sickness.”

Similarly, the philosopher Plutarch advised: “Instead of using medicine, rather fast a day.”

Advocacy for this style of short-term food restriction or fasting is gaining popularity once more. To start with, there’s the obvious benefit of weight loss through limiting calorific intake. For those who want to see a quick result, it’s likely to bring a speedy feeling of satisfaction and an improved mood.

Then there’s the matter of which foods affect us in which ways - Hippocrates believed the mind and body were inextricably connected (i.e. you are what you eat) - and the more we know about what a food is made up of, the more we learn about how it can affect us.

Chocolate helps lift our serotonin levels, making us happy, as do carbohydrates, which can calm us down. Caffeine can perk us up but for some it may increase nervousness or anxiety. 

If all these foods may have an effect on an individual’s mood, it stands to reason that the process of fasting, too, may effect some form of change. So, going “cold turkey” and doing without any food, so to speak, might be just what the doctor ordered. (As an aside, isn’t it interesting how much our words and phrases are linked with food – “cold turkey”, “I can’t stomach that”, “that’s too much to absorb”?)

There’s a lot of discussion on the internet about fasting and some research has been undertaken, but not enough to come up with a definitive answer.

As a result, I’ve done my own unscientific poll among a number of regular fasters to get their take on it. The general consensus is that, once the body has got used to the idea of no food, it works for them.  

Here are some comments: 

“I eat during an eight-hour window each day which gives me 16 hours of fasting. Some of that time, I’m asleep of course. Otherwise, in that period, I drink only water and green tea.

“Since I’ve been on it, I’ve been feeling much better. I rarely get depressed when following it. And I don’t get the lethargy of constantly digesting.”

Another regular faster said she tries to do it for three consecutive days a month:

“I start off hungry but my body seems to be getting more into the groove – I’ve been doing this for more than a year now.

“After the initial period – when I’m definitely way outside my comfort zone – I do feel calmer and have more of a sense of tranquillity with me. I also feel a sense of achievement because I’m doing something that takes quite a lot of effort and control. I’m impressed with myself and that stays with me.”

This faster stresses she does eat during the period, but not more than 600 calories a time, and she makes sure that two of the days are at a weekend.

I’ll leave the last word to Dr Robin Lawrence, consultant psychiatrist and founder of 96 Harley Psychotherapy who himself believes in the power of fasting. 

He says: “I have become convinced that fasting is good for my mood personally. I generally try to stick to a 5:2 regime, though sometimes it’s observed in the breach …”


Photo by Kamil Szumotalski on Unsplash