The psychology of film & TV, media, & work

Category archive

Work Psychology

By the time you read this blog, 480 million of your cells will be lost and resurrected. So, what excuse do you have for not changing?

in Editor Pick/Work Psychology by

According to science, I am no longer the same person I was when I was 34, roughly seven years ago. Every one of my cells has been replaced. In essence, I am a new person today. Pleased to meet you.

But I’m sort of the same. Possibly, I am exactly the same. I have the same sense of humour. I still chuckle at episodes of Seinfeld and make the same Simpsons’ references. I enjoy the same food and overall most of my family and friends still recognise me when they see me. Except when my wife says, ‘I just don’t know who you are anymore….’ Just kidding…I think.

I guess you could argue I’m a clone of all the Nicholas Ducks that replicate every seven years or so—just like everyone else, except if you’re six.

The idea that we remain the same yet are completely different is one of those paradoxes that baffles the mind momentarily until we get distracted by our most immediate concerns—another coffee, a deadline, a blog to write, Donald Trump. No, he’s no longer on Twitter. I wonder if Twitter had a drop in readership?

Back to the paradox…

As humans, we find ourselves in a predicament. On the one hand, we are constantly changing and adapting. No day is truly the same and our bodies are never static. Our minds never turn off. Even when we sleep, the mind continues to flow in dreams and unconscious processing, like a river that is always flowing yet somehow always the same river. We cannot escape our ever-changing, dynamic and fluid natures.

Given that we are always changing or always in motion, it’s puzzling that we naturally resist change even though it can be perfectly logical and, perhaps, essential for our survival.

We all have daily struggles with eating better foods, changing our fitness habits, pursuing meaningful goals, and maintaining relationships.

Freud used to refer to this as a struggle between the Id (our primal selves), Ego (our identity), and Superego (the ideal self or the self on steroids). The Id is that guy who tells you to have another slice of pizza when you've already had an entire pizza. The Superego looks on in horror as you destroy your weight-loss goals. The Ego is the poor schmuck in the middle—you—who must contend with these two fools.

And then things get even more complicated. Most of our lives, we are also trying to 'change' other people—or their Egos—too. So, you are not only having daily duels with your own primitive Id and Mr/Ms Perfect Superego, but you are also battling the Ids of your friends, family, and colleagues, their Superegos, and their poor schmucks who are just trying to get through the day. Ever wondered why you're so exhausted at the end of the day?

This is certainly one of the biggest challenges for managers and leaders in workplaces. You have to somehow inspire, order, convince, influence, manipulate, push, control, encourage, reassure, boost, embolden, sway, affect, stage, command, motivate, and stir others to change.

While it might be somewhat comforting to blame their Ids when the motivation is not there, there are actually some pretty simple reasons why implementing improvements and changes may not always work.

#1 People don’t agree with the change

It's hard enough to influence yourself or others to change when the end goal is something everyone aspires to. It's even harder if you expect someone to shift towards something that'll make them worse off.

Years ago, the trend was to move away from offices and utilise an 'open plan'. Managers and others in authority were told it was better for the culture: it would improve communication and encourage leaders to interact more with their teams.

Although this may be true, in part, it also decreased privacy and perhaps sent an unconscious message to middle managers that their positions were not as important or as stable as they once were.

The reality was probably more practical. Utilising open-plan offices saves floor space and costs.

Today, the trend has moved towards hotdesking. Not only is everyone in the same open environment, but the very desk you once claimed by the window—at least mentally—might be occupied the next day, and the day after.

Again, the hotdesking arrangement is often sold as allowing a more dynamic and fluid environment where you work from home and pop into the office whenever you feel like it. Employees, however, have practical concerns. Where do they put their books, notes and family photos? What happens if they come in and spend the morning looking for a space?

Worse still, unconsciously, these kinds of changes may lead to a sense that the job is not permanent. When you communicate fluidity and dynamism, this can sound both exciting and frightening. The poor old Id likes comfort and safety.

#2 People don’t understand the change

Most changes in workplaces are never straightforward. Typically, you need to navigate a minefield of politics, egos (or Superegos), and complex technical challenges.

Imagine having to explain this mess to people! Well, that's exactly what needs to happen. And, to make matters worse, there are many others competing for the attention of employees, trying to communicate their own chaos so that people can 'get on board' and support the improvements.

I remember working in the public service and having to produce detailed policy briefings that would eventually hit the Minister's desk. Getting that briefing to the Minister was an art form that seemingly required dozens of policy writers, bureaucrats, administrators, and various other ambiguous roles. The briefing would work its way up and down again, landing on my desk with mark-ups (always in black pen, not red, as red has become impolite).

Eventually, it would make its way to the most senior person in the organisation. On one occasion, after the 50th rewrite and weeks later, the Executive walked down to my manager’s desk with the carefully crafted briefing—developed by the cast of thousands—and said: ‘I’m sorry, but can someone explain this?’

On another occasion, the carefully crafted briefing was sent back immediately from the Minister, who did not like the advice one bit. His Superego (with a touch of Id fury) wrote, in capital letters, 'NOT APPROVED!' The funny thing was, we weren't actually seeking approval.

This is also why so many communications and emails from leaders are painfully sanitised: leaders are so concerned that their words will be misconstrued, misunderstood, or will completely confuse the intended audience.

#3 People can’t process the good ideas

As much as we like to believe we can multi-task, the reality is that nobody truly does. We just rapidly switch our attention from one thing to another. And we all have our limit.

Even when you have a scheduled meeting or a phone call and are trying to sell an idea or new project, you may never truly have someone's full attention. People think about all sorts of things, like, 'My pants feel tight today,' 'What am I having for lunch…wait, I can't eat that,' 'Who's this guy talking to me anyway?' 'Oh no, he's waiting for me to say something.'

Have you noticed that in the recent influx of Zoom meetings caused by COVID, individuals are clearly switching their attention to emails and other urgent priorities? You may have even been talking to someone and hearing the distinct tap, tap, tap of the keyboard. You know you have well and truly lost someone if you hear them typing.

Note: Sometimes people may be typing notes because what you are saying is actually important to them. So, typing can be a sign of complete interest or disinterest. Take your pick.

In short, in the sea of distraction and fragmented attention, it’s surprising that any new idea gets through. I guess if you are reading this and have got this far, that’s a great sign for me.

#4 It’s happening, but you can’t see it

That's right. More often than not, if you are seeking change, it can be happening right under your nose. All changes in workplaces start with a change in thoughts, a new reflection, an eventual shift in attitude. Eventually, new habits may form.

This takes a lot out of a person because it's essentially a physical change occurring in the brain that sometimes means they must give up a preconceived framework about how the world works. Because we aren't privy to the physical dynamics of another person's brain (unless you are a neurologist, and only if you are scanning it!), we need to infer the change from their actions. And this may take some time.

This is part of the reason why your workplace, online businesses, Uber, and service stations are always asking you to rate your satisfaction. They are basically throwing their hands up and saying, 'We don't know what's going on in your head. Can you please let us know?'

And hopefully they know themselves and aren’t trying to save your Ego’s feelings.

#5 You are selling me something that you don’t even buy yourself

I once had to run a few programs in a past job where it was my role to encourage the use of a behavioural framework that I knew wouldn't work. The aim was to integrate this new approach into projects that were under significant production pressure.

The last thing those teams wanted to do was attend workshops and be inspired, ordered, convinced, influenced, manipulated, pushed, controlled, encouraged, reassured, boosted, emboldened, swayed, affected, staged, commanded, motivated, and stirred into changing what they were doing.

I couldn’t blame them. People are generally interested in receiving help to navigate the complex world and all those grumpy Ids around them. They are less interested in being told they, themselves, need to change. That’s a hard sell and few want to be the salesperson.

I didn’t buy the program and they didn’t believe it either.

I do, however, stand by my thoughts and ideas in this blog. So, how come you haven’t changed already?

Asking questions makes you more likeable. It may even score you a date!

in Editor Pick/Work Psychology by

When I was studying psychology, our statistics professor had a strategy to get people to pay attention. She would systematically work through the class attendance sheet and quiz students on the spot in the lecture theatre.

I still remember the first question she asked me: 'Mr Duck, what does an alpha of .05 mean?' Don't worry, I won't bore you with the answer 95% of the time (terrible statistics joke).

This was, at first, a most troubling scenario. Most students were content with sitting back in relative obscurity whilst the lecturer did all the heavy lifting. If you switched off for most of the session, that was ok. It was the university equivalent of workplace presenteeism.
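For the curious, the idea behind the alpha joke can be sketched in a few lines of Python. This is a hypothetical toy simulation, not drawn from any particular study: an alpha of .05 means that, even when nothing real is going on, we accept roughly a 5% chance of declaring a 'significant' result by luck alone.

```python
import random

# A toy simulation of what alpha = .05 means: when the null hypothesis is
# true, p-values are uniformly distributed on [0, 1], so a threshold of .05
# produces a "significant" result about 5% of the time by chance alone.
random.seed(42)  # arbitrary seed, for reproducibility only

ALPHA = 0.05
N_EXPERIMENTS = 10_000

false_positives = 0
for _ in range(N_EXPERIMENTS):
    p_value = random.random()  # a p-value under a true null hypothesis
    if p_value < ALPHA:
        false_positives += 1

print(f"False positive rate: {false_positives / N_EXPERIMENTS:.3f}")  # ≈ 0.05
```

In other words, alpha is the false-positive rate we are willing to tolerate, which is why "95% of the time" is the punchline.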

Yet, for some reason, the fear of being drilled on a statistics question at random was enough to have every student sitting upright. They even seemed to be well prepared before the lecture.

In many presentations, workshops, training sessions and meetings, I regularly observe that the room is split between people who say very little and those who do a lot of talking.

There are probably a few reasons for this:

Lacking purpose
If you are sitting down with a surgeon, you will have some questions lined up. You listen to their advice. This is because there is a clear purpose to the meeting that motivates you. Meetings in other workplaces often fail to set a clear purpose. People can sit around for hours discussing issues without ever getting a real outcome.

If set as a regular meeting, the attendees will soon switch off completely. They will be physically but not mentally present (presenteeism). They may even start playing with their phone or completely ignore everyone while they respond to emails on their laptop.


Fear of offending
If you've ever watched the show Shark Tank—where would-be entrepreneurs pitch a business deal to successful entrepreneurs—you'd recall that the hosts of the show are never afraid to put people on the spot. Time and time again they question the guest on their business model, even if it means destroying the person's morale, their enthusiasm and their ideas.

However, in most areas of our working life, we avoid Shark Tank scenarios. Most of us are also sympathetic when a colleague needs to present or when an entrepreneur is introducing a business idea. We are always trying to find a balance between getting along with people and having the difficult conversations that improve productivity and quality. As an entrepreneur, you always want to hear the positive even though constructive feedback is more valuable. Workplaces that value politeness and harmony over business results can end up with individuals pursuing bad projects and ideas with no one willing to challenge them.


Fear of public speaking
In many instances, individuals don't speak up because they fear any kind of public speaking. When you ask a question or put forward your idea to a group, you run the risk of looking foolish or ignorant. More often than not, you are probably asking the same question everyone else has been thinking about!

Research from the Journal of Personality and Social Psychology suggests you may have little to fear from asking questions. In fact, across several studies, individuals who asked more questions were perceived by others to be more likeable.

In one study, the researchers were even able to measure the question-asking behaviour of people on speed dates. Individuals who asked more questions were more likely to get a follow-up date.

The researchers suggest that individuals who ask questions are perceived as responsive, which is associated with listening, validation, understanding and care. Importantly, the researchers found that individuals typically do not realise that asking more questions makes them better liked.

For the person on the receiving end, questions are a sign that you are at least interested in the ideas they are presenting. Someone passively nodding in agreement is more likely a sign that they want the conversation to end quickly.

In a job interview, this means you may be more likely to get the job if you ask questions. At the same time, you may also learn something from the answer. A win-win for you.

How was this blog? Was it ok? What could I do better?

Why we look like our names…or does this mean I look like a duck?

in Editor Pick/Media Psychology/Work Psychology by

To many of those who know me, I have had a bill for a mouth and a feathered complexion for years. Let's not be silly here. Aside from the u, c, and k following the letter d, there's nothing about this name that reflects who I am. It's just a name, right?

Wrong. According to the Journal of Personality and Social Psychology, a person's name may be more closely aligned with their physical appearance than you would think. In one study, when shown a photo and a list of possible names, participants seemed to be able to link the right name to the face more often than could be explained by chance.

Suddenly, all those moments you had when you were thinking, “He really looks like a Peter,” may have been right all along.

The specific mechanisms behind why we can do this are not entirely clear. However, when participants were presented with unfamiliar names from other cultures, the effect vanished. The researchers suggest that stereotypes and cultural norms about names establish mental "schemas" that people use to predict how a person looks.

In the absence of these schemas, there is no easy reference we can use to make accurate predictions.

One possible explanation is that we simply match the name to the face. That is, that little infant, Bob, looked like Uncle Robert when he first made his way into the world. This seems unlikely. First, many parents have a name selected well before the delivery day.

Second, we have all been in a situation where we hold off saying “her” or “him” when meeting an infant for the first time. Why? Usually, babies look very similar and even gender is difficult to differentiate during infancy. I can recall gazing lovingly at my first-born daughter in the hospital only to realise I was staring lovingly at some other baby.

This also explains why babies sometimes get sent home with the wrong parents. They don’t usually have pronounced facial features until much later. Did you ever take the wrong toddler home? No? What about the wrong spouse? Ok, let’s not discuss that one.

The findings from this study underscore the importance of social identity. Whilst many of our personal characteristics are strongly wired at birth through genetics, our social and cultural upbringing also shapes what we value and how we behave.

Fashion, for example, continually changes and influences how we dress, style our hair and groom ourselves. As such, our outward appearance can reflect the expectations of society. Perhaps our names subtly influence how we style our hair to conform with preconceived ideas of what a person with "that name" looks like.

Interestingly, one of the key physical characteristics that we can change quite easily is hairstyle, and this was found to be a cue for name recognition. That is, when people accurately predicted a name, it was often based on matching the person’s identity with their chosen hairstyle. Participants did not realise this, of course, but the researchers could determine this based on measuring where participants were focusing their gaze.

I’ve often wondered what it would be like to be called “Nick Smith” or “Nick Jones”, something very different from the “Duck” surname. At school, it was a non-stop circus of quack, quack jokes and Donald/Daffy Duck references.

A name like “Duck” could be a curse if you always wanted to be taken seriously and if you wanted to blend in with everyone else. But it can be a strength if you value being different.

In adulthood, the jokes remain but I’ve learned to appreciate having a relatively silly and comical sounding name. People also seem to like pairing it with “Dr”. Perhaps the juxtaposition with such a formal title is pleasing for people.

A good name can also make you more endearing. There was that clip of Melbourne-based news reporter Amy Parks finally giving a news report from Melbourne's "AAMI Park". This mere coincidence gained the reporter a lot of attention and social approval. That clip currently sits on a couple of hundred thousand views on YouTube.

A name may not always be so harmless. There are apparently a lot of people sharing the names of notorious fictional serial killers (96 Norman Bates, 12 Jason Voorhees, and five Freddy Kruegers) in the United States. Is it a nice icebreaker to be called Norman Bates or something that would feel a bit creepy?

Interestingly, how much you like your name is also related to your self-esteem. Individuals who rate their name lower than others also tend to have lower self-esteem. Self-esteem is believed to be a gauge of social acceptance. That is, lower self-esteem indicates that we feel less accepted by our social groups than individuals with higher self-esteem. So, we now know that our names can tell us a lot about how we feel and behave. These things are intrinsically linked to fitting in.

Over the years, I’ve also accumulated a number of odd friends and colleagues. I sometimes wonder if my own experiences with my name have led me to look to other quirky individuals with a similar sense of humour, or strange peculiarities that make them black sheep–or black ducks–in their own right.

Have a think about your own name. Has it had any influence on your friends, colleagues, or career? Would a different name elevate your status in a job interview? Is it associated with a sense of pride or something you’d rather redefine?

Quack, quack, quack.

Not interested in having your mind blown? Stop reading now

in Editor Pick/Media Psychology/Work Psychology by

Ok, so if you’re not in the mood for having your mind blown, stop reading right now.

Still here? Ok, here we go.

The universe is about 93 billion light years in diameter. Given that light travels at about 300,000 km per second, one light year works out to roughly 9.5 trillion kilometres, so I won't even try to express the size of the universe. In short, it's big, and it makes us incredibly small.
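If you want to check the arithmetic yourself, it only takes a few lines. All values here are rounded, so treat the results as back-of-the-envelope approximations:

```python
# Back-of-the-envelope check on the figures above (all values rounded).
SPEED_OF_LIGHT_KM_S = 300_000          # light travels ~300,000 km per second
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ≈ 31.6 million seconds

light_year_km = SPEED_OF_LIGHT_KM_S * SECONDS_PER_YEAR
print(f"One light year ≈ {light_year_km:.2e} km")  # ≈ 9.47e+12 km (9.5 trillion)

# The observable universe is roughly 93 billion light years across.
universe_diameter_km = 93e9 * light_year_km
print(f"Universe diameter ≈ {universe_diameter_km:.2e} km")
```

A trillion kilometres per light year, multiplied by 93 billion light years: a number with no everyday meaning at all, which is rather the point.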

Ok, you’ve probably heard all this before, right? Apologies for the ruse. You already know the universe is big, complex, weird and terrifying/amazing all in one. Your mind is probably not blown at all. But it should be. Because the universe is incredibly awe inspiring. Let’s not forget that.

Like most people, you’ve probably siloed this confronting idea in the deep, dark recesses of your mind and have got on with your morning coffee and emails. Good for you.

If you don't spend your days working in astrophysics, it is understandable that you probably don't want to think about this too much. It can easily make your day feel rather pointless in the big scheme of things. It can easily trivialise all the activity across the globe, and even time itself.

Even when writing this blog, I really had to make an effort to bring in the universe example, rather than something more accessible—say, Donald Trump—as it overwhelms me to think about the universe and existence for even a few small moments. Yet, I seem to allow time for Trump. What a strange fellow he is. But how strange that we can't stop writing and talking about him whilst something like the universe is there crying out to be noticed.

So, I’ll do you a deal. I’ll get on with writing this and you’ll get on with reading this and then we can return to our simple, coffee-drinking, Netflix-watching, Facebook stalking existence, which is just so much more enjoyable.

But if you take the time to remove yourself from these smaller activities, it is truly amazing to consider that we are part of something much greater and more inspiring than our morning cup of coffee. The universe, and similar concepts that are much bigger than ourselves, promote awe, recalibrate the ego and can connect us with our peers, society and culture.

Awe is what got man to the moon as people watched on from the comfort of their homes. It inspired science fiction shows and films, elevating the unknown and a passion for space exploration.

Research from the Journal of Personality and Social Psychology indicates that awe can also be used for our more immediate and modest goals.

Across several studies, the researchers demonstrated that priming participants with a scenario or memory that inspired awe also promoted more ethical behaviour, generosity and values associated with others and nature, rather than Trump-like ambition and power.

According to the researchers, awe makes us feel ‘smaller’ and connects us with a broader purpose and collective. In one study, simply reflecting on a large Eucalyptus tree subsequently promoted the prosocial values of individuals compared to those who reflected on an office building. Presumably, even nature—its history, complexity and connectivity—can inspire awe.

It seems that efforts to promote teamwork and collaboration within workplaces could be better spent on ensuring the workplace has these moments of awe. All the better if a workplace can anchor its entire vision, purpose and goals behind something bigger and awe-inspiring.

Too often I see workplaces take the easy way out. It is easier to examine an environment that is deficient in behaviours, values, communication, and business plans. It’s much harder to diagnose a problem with purpose and inspiration.

It’s no wonder we notice others so quickly reaching for the computers and smart phones. These are not people who are caught up in an awe-inspiring workplace. Their internet connection is more valuable because it connects them with people, ideas, creativity, and fun.

Many workplaces seem to recognise the importance of vision and purpose, as each year they produce plans and write strong 'purpose' statements. But how often are these documents really awe-inspiring? The visions and purposes are most often mediocre statements that are interchangeable with those of many other organisations. They are basic enough not to offend and vague enough that the workplace's activities easily align with the goals. Employees often don't care too much about the workplace vision and goals and get swept up in the day-to-day tasks that have more immediate implications.

But let’s get back to work now. As important as it is to be inspired, we also have a day job, emails to write and coffees to drink. We have Facebook posts to draft and LinkedIn accounts to manage. Perhaps the universe has a grander purpose that even makes sense of all these emails, status updates and coffees on planet earth?

Ok, now what’s Trump up to today…oh dear.

Weinstein, the Nazis and you

in Editor Pick/Media Psychology/Work Psychology by

In the red corner is Harvey Weinstein. Weighing in at over 250 pounds, with a net worth of $250 million. Nominated for over 100 awards, an Academy Award winner, with influence over some of the biggest names in Hollywood.

In the blue corner, every single person in society, including disgusted members of the public, former actors and actresses who knew directly or indirectly of his actions, assistants, victims of abuse, and almost every person in the media.

Each day, the number one story across the globe appears to be Harvey Weinstein. Yet another person comes out to share their experience. And, each day, commentators in the media and important global figureheads frown with disapproval. 'Why would so many people stand by and let this happen?' Indeed, if you weren't confronting Weinstein, you were dubbed an 'enabler'.

The reason why the Weinsteins of the world do what they do without fear of retribution is something that has long been studied and understood by social psychologists. And it has to do with an anecdote about Nazis. Yep, our favourite real life and Hollywood villains.

Early studies in social psychology attempted to explain how seemingly normal people could commit atrocities, like the Nazis in World War 2. Were these people truly evil or placed in circumstances that made them do horrible things?

Many are familiar with the Stanley Milgram experiments where, under pressure from an authority figure, participants would administer seemingly painful electric shocks to another person. Some would do so even to the point of the other person screaming in pain. However, this was all a façade. Nobody was truly in pain. The study was simply examining whether a normal person would follow orders even in the face of cruelty.

Since these early experiments, social psychologists have demonstrated that people behave in peculiar ways when surrounded by others. For example, we are willing to ignore or downplay evidence so that we can maintain harmony with a group—groupthink. Some believe this can lead to catastrophic outcomes when risk is downplayed and overlooked.

Many of our phobias are related to how we are perceived by others. We may fear public speaking even though there is no true physical threat. Job interviews tend to be more stressful than they should be. The first day of school or a new job is a confronting experience because of the unknown social aspect.

In public, we all instinctively conform to fit in with our surroundings. How many of you feel uncomfortable holding a phone conversation on the train in the morning when everyone is quiet? How difficult is it to disagree with the majority in a workshop when it may mean slowing down progress or having to debate an issue?

It's probably not too surprising to social psychologists that Weinstein was able to do what he did. Through his sheer physical size and powerful personality, he could intimidate. But he also had a ridiculous amount of money and influence from his position. If you've ever hesitated about speaking up on a workplace issue, then imagine how impossible it would be to challenge the might of Weinstein, surrounded by others who played along.

But, interestingly, research also shows how individuals can overcome intense social pressures. In one study, a participant was asked to judge whether a line was shorter, longer, or the same as another line. If they were placed in a room of people who purposely misjudged the length, the participant would also align their view with the rest of the group. However, if only one person disagreed, it was enough for the participant to feel comfortable to disagree.

Doesn’t this sound like what’s happening now with Weinstein? All it took was a few people to speak out to give others the confidence to do the same.

What we can learn from Weinstein isn't just a lesson on morals, decency, and corruption. It is also a lesson on how we as individuals can fight the social current in any context and bring about change. You might even find people jumping in to support you.

Ding ding, ding!

It’s simple, just do the Opposite

in Editor Pick/Work Psychology by

I started our business Opposite two years ago.  Since then I have done the complete opposite of what I used to do. I sleep during the day instead of night. I walk backwards. I yell at people in the cinema rather than whisper. Backwards blogs my write even I.

Ok, I am not so much of an extremist that I would ever take the concept of Opposite to absurd levels. The idea was simple. Perhaps there were workplace practices that were so broken that doing the complete opposite was the solution. In short, we needed a name that implied we were willing to challenge the status quo.

Here are a few successes and challenges along the way.


The successes

Do you really need that procedure?

One of the most enjoyable components of the work has been being able to challenge the status quo on procedures and processes. We have developed simple websites and mini-workplace tools to replace boring, lengthy and unreadable procedures. We've engaged people with graphic design competitions and gamified solutions to deploy business processes. At its core is the idea that procedures and processes are often viewed as important technical documents rather than engaging instruments of change.


Human Factors

When I started Opposite, I was planning on moving out of the field of Human Factors for a change, until I realised it was an area that people were talking about. This contrasted with my previous 10 years' experience of educating people about the importance of Human Factors. Somewhere in the past few years, it has become topical.

People are recognising that many failures and problems with technology, infrastructure, and workplace processes need Human Factors thinking. That is, a formal understanding of how and why people interact with their environments.

In many ways, considering Human Factors in the way we conduct work is 'opposite' to the way it used to be done: build it and then get people to adjust to it.


Developing solutions not just insights

I used to roll my eyes when consultants would come in to educate the business about principles and frameworks. I didn’t need someone to tell me that the customer is important or that clear accountabilities are critical. Most of us understand the basics but just need help coming up with the ideas and solutions to address these principles.

At Opposite, we didn't want to just write a report with 20 recommendations and move on. We are more interested in helping workplaces develop the specific tools and workplace practices that help implement the recommendations. This is the hardest part for workplaces and consultants because nobody has the precise solution.


The challenges

Paperwork culture

An unfortunate element of work today is the fear of prosecution when designing a solution that has safety implications. The end result is that many workplaces are too afraid to try the opposite and innovate. This is unfortunate because most, if not all, people want to see greater focus, more creativity, and activities that add value. Paperwork often serves one purpose—to cover the workplace—not to drive business improvements.



Time continues to be a predator that pursues you at every turn. Workplaces have increasing demands and seem to fit more and more into every day. Our workplace is no different. Since I started running the business out of our humble home in 2015, we have grown to a team of five consultants. With more people, it seems there are fewer and fewer hours in the day. In the tradition of Opposite, more resourcing has seemingly left less time for home and recreation.

I have yet to see a workplace that somehow reaches equilibrium and strikes the right balance of work and recreation. Work is addictive, rewarding, restricting, fun, and stressful. The creative projects sometimes get put on hold because we simply need to deliver that report.


Next steps

Going forward, we are working to finish some new projects and ideas. Here are a few that are in production:

Launch. We’ve developed the first prototype of a workplace productivity tool, ‘Launch’. This has been a slow-burn project that is nearing completion for testing.

Brandbattle. We’ve developed a website that pits brands against each other in a competition. The website is developed as well as a tool that assesses your ‘logo personality’. Just a few more tweaks and we’ll launch it soon.

Gamified & Interactive Training. We are currently developing a game-based team development program, as well as Human Factors training that allows users to explore and immerse themselves in 360° environments. Sounds cool. We just need to address that ‘time’ issue above so we can devote the time it needs.

We are also looking to partner with organisations and to keep testing our ideas. If you have a particular workplace issue that needs some opposite thinking, then drop us a line.

And thanks to everyone who has helped support and grow Opposite, especially our team: Conor O’Brien, Marty Lynch, Christine Antoniou, Patrick McGrath, Ray Misa and Andres Meneses.

A few misconceptions about working for yourself

in Editor Pick/Work Psychology by

Running my own business, I don’t presume to know it all.

But now that I’ve reached the milestone of two years in business, I have noticed a few misconceptions about working for yourself that just don’t add up.

Here they are:

#1 You need to work harder

There is no doubt that I have needed to work hard but, you know what, most people do regardless of whether they work for themselves or not. Day in and day out, I work with people in permanent roles who continue to take on more and more work, weekend work, and endless intrusions from their mobile phones.

#2 You have less job security

I was warned from day one by other consultants and colleagues about how in small business you wear a lot more risk when the work dries up. I hope our team continues to be seen as useful well into the future but my experience to date is that working in big organisations is riskier.

In a big organisation, restructures occur regularly. If the workplace is too slow to change to a diminishing market, a common strategy is to lighten the load through redundancies. Even if you are clever, handy, motivated, and committed, you can still find the organisation dispassionate and cold when only years before it was welcoming you with open arms.

In small business, though, you have agility. You can move with the market, and you ultimately have a lot of passion invested in keeping yourself employed because, well, you’re you.

#3 It can be isolating and lonely

I fortunately had a great colleague who offered a chance for regular coffee catch-ups because he knew working for yourself can be isolating. But it can also mean reconnecting with dozens of people who you haven’t seen in years.

I’ve been fortunate to work with and catch up with former bosses, friends and colleagues. Every trip to the city can mean squeezing in time to see someone you haven’t seen in years.

Not to mention that I’ve been able to invite friends and study companions to work with me. I’ve reconnected with half a dozen peers from my university who I would otherwise struggle to see once a year, if ever.

So, working for yourself can actually promote your connectivity with people.

#4 Your work-life balance is thrown off

I am actually writing this blog on a Saturday afternoon. It’s a nice enough day outside and my kids are home. What am I doing?

Well, working for yourself means that each day can be seen as a work day. It can also mean each day can be time with the family. You can work from home or take a few days off without getting approvals.

You can even invite your family to work with you. Every week, my wife comes in to help us run the office. If I were working for a big organisation, she would have found a job in one just like me and we would have seen each other less often.

Things are not thrown off balance. They just end up being different.

Now, excuse me while I eat my cake too and invest some of my Saturday in the nice day outside.

Happy birthday to us!

How I apply Twin Peaks to work

in Editor Pick/Film & TV Psychology/Work Psychology by

‘People will be looking at this. We don’t know what will be there in 50 years. We have to think about how it will suit the community in the long-term!’

‘No, at the end of the day, it’s just a concrete box. End of story.’


I observed this tense argument between an architect and a construction manager about the design of a functional building. They had finally met on the eve of construction, where their competing priorities and ways of perceiving the world brought them into conflict.

For the architect, the design was everything. It was a place for people to work. It was something that would be viewed and either hated or admired. They perceived the building in a holistic way.

For the construction manager, it was a functional building with a budget and time-frame. One of hundreds he would see go up and, one day, come down.

There was clearly a gap between the ways they perceived the intended purpose of the building. The architect viewed the infrastructure as a component of the broader community. The construction manager perceived the building as a functional tool to be operated and maintained.

These perceptions, and the ways we categorise the world, inevitably lead to disagreement and conflict, and can also limit or enhance our creativity.

When we are infants, we learn to recognise the blurry images in front of us and assign them labels from the words we hear. We learn to make distinctions between objects and assign them distinctive qualities so that we can navigate the world.

Over time, our ability to categorise the world becomes more inventive and complex. We form abstract ideas like ‘art’ and ‘science’ and silo our ideas and thoughts around what these constructs mean. For example, we learn to associate numbers and formulas with mathematics. This construct is related to order and predictability. In contrast, art comes to mean freedom, creativity, and expression.

The psychologist George Kelly referred to concepts like these as ‘constructs’ as part of his personal construct theory. Kelly believed understanding constructs was the key to unpacking the mind and behaviour.

Constructs start off black and white. There’s ‘good’ and ‘bad’ and ‘yummy’ and ‘yucky’. But over time, we become better at the shades of grey.

For example, art is initially viewed as the activities we provide children, like painting, drawing and craft. But over time, children and adults learn that art can require a level of mathematical precision—perhaps scientific accuracy—when applied.

I recall that when I was at school, a talented friend of mine drew a shape and asked me what I thought it was. I guessed, but I was wrong. It was just a ‘line’. I was looking for meaning, but he was educating me about the practical reality of art. Art could be broken down into a simple logic of geometry and measurement that made him a better artist.

Similarly, maths and scientific theories, although orderly and logical, ultimately need to be expressed and communicated in language, which can be highly subjective and even creative. The more complex concepts, like quantum mechanics, require a level of abstract thought and imagination that is beyond most artists and possibly most people on earth. Einstein famously said he arrived at the concept of relativity by imagining what it would be like to ride a beam of light.

In short, strict adherence to how we view a construct can limit our thinking and, hence, our progress. When we allow something to challenge our constructs, it can lead to something more original and memorable.

The most recent time this happened for me was revisiting Twin Peaks, which has returned to television after a 25-year hiatus.

The original show came out in 1990 and drew the audience in with a classic ‘whodunnit’ murder mystery. This central plot played to the logical murder-mystery construct. That is, we expected to follow a detective solving a series of clues until the killer was revealed in the final act.

But director and co-creator, David Lynch, disrupted the genre, introducing surreal and non-traditional elements.

There were dream sequences involving mysterious clues provided by a dancing dwarf, a giant, and the murder victim herself. The primary protagonist, Agent Cooper, used intuition and insight to help him solve the mystery. One scene showed Cooper stating the names of suspects as he hurled rocks at empty bottles. The rocks that hit pointed him to the prime suspects in the murder.

In the finale of the original show, he eventually visited a strange parallel world between worlds and came face to face with demonic spirits who seemed to be at the bottom of the murder mystery. So much for ‘the butler did it’.

Lynch deconstructed the traditional murder mystery, which infuriated many viewers who simply wanted a beginning, middle and end to fit their existing schemas.

The result was something unique and memorable that invaded popular culture and has since influenced many popular shows for decades.

The new season of Twin Peaks has taken this notion even further.

A whole episode was devoted to the first nuclear test in New Mexico in the 1940s that catapulted the audience inside the explosion where a parallel universe was seemingly unlocked. For 45 minutes, we are invited into an extensive, surreal television experience that would challenge the obtuse constructs of most television audiences.

To many, it would be perceived as unstructured nonsense. To others, it provided meaning in a non-linear yet structured manner.

To some extent, Lynch returned the audience to their infant state again trying to find meaning in the blurry images and sounds. It’s uncomfortable and challenging because it doesn’t fit an existing construct.

Benefits of Challenging Constructs

Aside from the enjoyable/frustrating Lynchian trips into the surreal and abstract, I believe this kind of thinking has a modest application in more practical settings.

Creativity can be achieved by relaxing the boundaries between constructs.

Computers were once considered a tool for programmers and computer technicians. The average person didn’t use, or even know they needed, a computer. The visual interface, popularised by Apple, destroyed this artificial boundary.

The computer was no longer a computational tool. It became a way of facilitating every aspect of life and work through automation.

More recently, this new way of thinking has opened new models of operating and has disrupted whole industries through the internet and smartphone apps.

My first experience in an Uber required overcoming an apprehension that I was sitting in a strange car with a strange person. Of course, I had been doing this my whole life with taxi drivers. The construct of how we travel, and who can provide the service, needed to evolve.

I’ve also learned to embrace the ‘cloud’. My original construct of ‘security’ was that I had my valuable information in a close, secure, physical location like on a backup drive.

But if it’s kept in one place, like on my laptop, and is stolen, then I will never see it again. The construct of security needed to evolve to appreciate that information is sometimes more secure when it isn’t limited to one place.


Redesigning Workplaces

As part of Opposite, I’ve spent the past couple of years using an approach called process engagement, which deconstructs existing constructs—that is, ways of thinking and structuring our work—to help develop creative solutions as well as reset thinking to simplify workplace processes.

Here are a few examples of how it helps:


  • Initially remove the labels and categories from the existing processes, structures, and roles. The preconceived structure and language can bias you before you even begin. For example, if you assume someone has the role of ‘administrator’, you will ensure there is administration for them to complete.
  • Set some parameters around the process redesign so that the task isn’t aimless, but make sure those parameters do not box you into designing the same thing again.


  • The focus should be on defining the user’s/customer’s needs, not their preferences. The preferences usually come from preconceived ideas or ‘constructs’ about what works. For example, the user may assume they need a new procedural document when they actually need a training program, or to remove some procedures.
  • Don’t assume that stakeholder consensus is a positive sign. Avoid the need for a solution to appeal to all stakeholders. Inventive ideas can challenge constructs. If some individuals are pushing back, this doesn’t always mean you are wrong. It might be a sign you are on the right track.
  • Allow your own ‘fresh’ ideas to be challenged. Your ideas may be the most ‘construct-challenging’, but challenging a construct is not the aim in and of itself. It still must bring about something meaningful.

Design & Implement

  • Eliminate visual cues that prime an instinctive response. Like Lynch, you don’t always have to convince someone in a rational, linear way. You can simply make something ‘feel right’. For example, there is a tendency to make logical systems and processes look like complex blueprints or contractual documents. This primes the user to expect something monotonous, so they won’t read it.
  • Prioritise aesthetics. Many people make the mistake of assuming graphics and attention to fonts are something you do at the end. Aesthetics should not be thought of as an enhancement to the solution. It is intrinsically part of the solution.
  • Get the solution implemented immediately for testing. There is a tendency to keep refining an idea until it’s ‘safe’ to distribute. But it is much more beneficial to engage users in a collaborative process as you build and design.

Don’t blame me. The ‘system’ wrote this

in Editor Pick/Work Psychology by

My friend was holding a can of soft drink and as he checked his watch he poured the drink on his foot. A person in my line of work—Human Factors—would call this ‘human error’.

How would we interpret the situation if, instead of laughing, I simply rolled my eyes and lost respect for my friend?

What if they ruined their shoes?

What if it left a puddle on the ground that led to someone slipping over?

What if after slipping over the person cracked their head and died?

What if that unfortunate person was also holding the cure for cancer and now this silly act of tomfoolery had led to the unnecessary suffering and death of people all over the planet?

But let’s say my friend knew he was going to pour the drink on his foot. It might be for a laugh and to get a reaction—to play the clown. This would no longer be an error but a form of intentional behaviour. That is, there is some additional calculation in the brain that determines the behaviour is a worthy idea.

So now, instead of this being a catastrophic event caused by a harmless error, my friend is culpable. Those few seconds of planning and intent are everything.

The cause of the event can be found in the deep, complex recesses of my friend’s brain. Somewhere in there neurons fired in unison and sent signals to my friend’s wrist to twist and pour the drink. Somewhere in this brain there is something to blame and assign fault.

Alternatively, the blame lies elsewhere. It could be attributed to the broader system. My friend may have been trying to impress me with his sense of humour. So, he was under peer influence.

The soft drink can manufacturers could be blamed as they designed the can. They also failed to display a warning message that these kinds of events could occur.

The surface of the footpath may be partly to blame. Surely, it shouldn’t become slippery from a small amount of liquid.

Local councils may have under-invested in the quality of footpaths due to a broader systemic issue related to funding.

The funding was the result of an economic downturn and, yep, we were willing to tolerate the possibility of soft drink-related deaths so we could save a few dollars.

Perhaps even the broader culture is to blame. After all, we live in the age of YouTube videos and Facebook where individuals love to play the fool to get some much-needed applause from their peers.

Of course, if we play out a genuine scenario where an error—as harmless as it can be—led to true catastrophic events, the same basic logic is often applied after the event. What plays out time and time again is the extent to which a person caused a problem and how much of this was caused by the ‘system’.

I feel deeply uncomfortable with blaming individuals even when they choose to do silly things. This is because I sometimes do silly things myself. Likewise, I feel deeply uncomfortable with blaming the ‘system’ as it leads to a whole host of other implications.

Importantly, blaming behaviour on the ‘system of influences’ suggests that we must also accept that success, bravery, creativity and acts of kindness are the result of the system. Nevertheless, we often seek to praise and reward individuals when they demonstrate these positive attributes but can quickly revert to blaming the system when they display poor behaviour.

Is the system causing these things or not? I’m not sure we can have our cake and eat it too.

The heart of my discomfort is probably related to the concept of free will. When we seek to blame individuals for their mistakes and punish them, we must also assume that they have the free will to choose this action.

When we blame the system, and argue a complex series of events over time culminated in the event, making the individual a passive participant in the transaction of soft drink homicide, we imply that the individual does not have free will.

Systems thinking might be seen as a cover for deterministic thinking.

Deep down we want to blame people because the idea that we don’t have a choice in the matter is also alarming. If I do not have choice, then what am I? And can I celebrate my successes? Who’s typing this blog anyway? The system?

And if people generally feel more comfortable blaming others then this is ultimately a product of the system too. So, we have a deterministic system that basically advocates free will. Is your head spinning with this pop-philosophy?

There is, of course, a softer conclusion to draw. We might argue that individuals have choice but are heavily influenced by their past and immediate surroundings. Somewhere in my friend’s brain, the system has contaminated their intentions but those neurons still have the capacity to side-step the infection and come up with an alternative.

The individual, according to this view, triumphs over the system. But, then again, how did the brain achieve this? Aren’t those neurons ultimately a product of the person’s genes, development and experiences? That is, all elements of the system anyway?

So, when I see someone actively trying to force blame on individuals, I believe we are no better at understanding individuals—perhaps much less so—than we were thousands of years ago when ancient philosophers debated free will and determinism.

Deep down, they are reconciling their discomfort with determinism like the time Aristotle pretended to spill wine on his foot to get a good laugh…


Stairway to heaven…sorry, work

in Editor Pick/Uncategorised/Work Psychology by

As a youngster, the sight of the escalators at Parliament station in Melbourne was the trigger for great excitement. The steps climbed almost vertically, way off into the distance. The destination was The City. It was full of activity, toy shops, nice food and video arcades.

Flash-forward, say, 10 years, to when I started my first real job – the job where you had to start being all serious, dress professionally and conform. The escalators no longer represented that joy and expectation.

Instead, looking around at all the melancholic faces on a Monday morning as people filed up the escalators was downright depressing. They reminded me of soldiers marching off resignedly to war.

There was a faster march for some—about five per cent or so—who somehow generated the energy to walk upwards.

But most of the people around me didn’t look happy. They were gloomy. The weekend that promised so much freedom had passed. Now they were nursing hangovers and anticipating the long week ahead.

Today, when I visit Parliament station, I still see those same faces. Not the same people, but the same faces nevertheless.

Those sometimes promising, sometimes demoralising steps remind me of an anecdote from the satirist and comedian Barry Humphries, alias Dame Edna. Humphries recalled sitting on the tram going into the City as a young man, carefully observing everyone around him in their suits on their way to their office jobs. He knew then—instinctively—that real work was not for him.

So, Humphries became one of the lucky ones who managed to sidestep adulthood and drudgery and live a life of games and make-believe.

Most of us elect to join the people in suits. How do you feel about it?

This is, of course, a most pertinent question. People in Australia who are lucky get to live the average life span of about 80 years. If they work from 18 to 65, that’s about 47 years or, at 48 working weeks a year, 2,256 weeks. At a 38-hour week, that’s basically 85,728 hours or so.
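For the curious, here’s a quick back-of-the-envelope sketch of where those numbers come from. The 38-hour week and 48 working weeks a year (four weeks of leave) are my assumptions; they happen to line up with the figures above.

```python
# Back-of-the-envelope working-life arithmetic.
# Assumptions (mine): a standard Australian 38-hour work week
# and 48 working weeks a year, i.e. four weeks of leave.
working_years = 65 - 18        # working from age 18 to 65
weeks = working_years * 48     # working weeks over a career
hours = weeks * 38             # working hours over a career

print(working_years)  # 47
print(weeks)          # 2256
print(hours)          # 85728
```

Tweak the weekly hours or the amount of leave and the total shifts by thousands of hours, which is rather the point.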

That is a lot of time to play or work with. It can be spent working hard from day one, using your body and mind to toil away so you can start saving for a house or pay the rent, buy a car or travel the world.

You might put another plan in place, devoting the time to education and postponing the immediate monetary rewards for a longer-term strategy. You might trade an hour now to make your future hours worth two, three, or perhaps many, many more, depending on how lucky or clever or resourceful you are.

You could invest in a career or university degrees.

You could squander the hours by making a bad decision or simply having bad luck. Perhaps that career or degree wasn’t really right for you or maybe the industry changed so fast that you’d already become redundant. You could lose a good 20,000 hours of your career doing that.

Alternatively, your strategy might be to use those hours to manipulate and thrive off the existing corporate structures and workplaces that are doing well to get ahead through your networking and devotion to a team and workplace.

Perhaps you could go it alone and vow to run things your way. But, ultimately “your way” is trumped by the voice of the customer. The hours, one way or another, are always devoted to helping other people and following their plans.

Of course, most people use a combination or all of the above whilst other aspects of life come and go. Marriages and relationships blossom, kids enter the scene, and travel and entertainment lure us away from responsibilities. Then there’s the rude interruption of illness and death. But I had all these hours…

That 85,728 hours is a lot to play with, but you don’t ever get them back. Some of those hours, perhaps even a good few months, can be spent walking up stairs to a place where you really don’t want to go.

I’ve now spent a good hour at least writing this blog. I haven’t written in a while because I’ve been so busy investing my hours in other things.

The ultimate dream was to be able to work for myself so that I could have the freedom and flexibility to write and reflect, which is what I enjoy the most. Hence, this reflective and meandering blog.

I guess sometimes you have to put all those hours aside and just do what you like to do. Right Barry Humphries?

What Super Mario Bros teaches us about motivation

in Editor Pick/Media Psychology/Work Psychology by

People who were around during the moon landings often tell me what it was like that day. I don’t really have a moon landing story. But I can tell you about the time I was a kid and I first witnessed the launch of the video game, Super Mario Bros, on my friend’s television.

There were the obstacles, jumps, magic mushrooms and endless falls down bottomless pits. Quirky sounds triumphantly proclaimed growth, progress and victory. Goals were signalled with flags and celebrated with fireworks.

Enemies came from below and above. Each one had its own personality. You could jump on the head of one enemy and squash it but the next one would be covered in spikes. Another would duck its head in its shell, which would then ricochet off a wall and return to knock you over.

Kids all over the world were hurling their controllers around the room desperately trying to get this tiny little Mario sprite to reach the goal of rescuing the princess. Nobody really cared who the princess was or why she was even captured in the first place.

There was no genuine reward other than the pure satisfaction of getting to the end. Forget all the textbooks on motivation. Nintendo had captured lightning in a bottle.


Gamification…not that gimmick again!

Let’s jump forward, say, 20 years or so. The term ‘gamification’ took hold and spawned some innovative ‘game-based’ problem-solving approaches, as well as setting millions of eyes rolling. It was gimmicky, like a typical management fad, and seemed to trivialise our important day-to-day jobs.

My eyes weren’t rolling, though. The eight-year-old in me was grinning. I suspect I wasn’t the only one. Somewhere between adolescence and adulthood—whenever that transition is finalised—we all shift from embracing fun to becoming very serious about work. Work isn’t a video game. It’s business. And business is a serious affair, Dr Duck.

For those of you unfamiliar with gamification, the idea was to use the elements that make a video game so engaging and apply them to the way we go about work. The best gamification has already been applied without you realising it. There are the subtle movements and sounds your phone makes when you activate it. The various apps you use have adopted gamification principles, like including avatars, scores and rating systems.

Fortunately, gamification doesn’t have to be a management fad in my line of work. As a psychologist, I became curious as to the underlying mechanisms that make video games so engaging. Here are a few observations:


Meaningless scores and progress

Video games are addictive because they provide an ongoing sense of progress. I remember adults observing Super Mario when I was a child. They seemed to link the objective of the game to the score in the corner.

The score, however, was never the goal. Unlike earlier video games, like Space Invaders, where high scores were presented on a screen, in Super Mario Bros the score was never compared to other users. It was the mere feeling of progress that was motivating.

Adults filtered the goal of the game through their own orderly logic. There had to be some reward in reaching the end. However, like any good job, the work in and of itself was the reward.


Power mushroom sounds

In a recent job, I was informed by the IT professionals that office computers shouldn’t have sounds as they are distracting. No doubt this was correct, but I grieved the lost possibility of using sound as a subtle motivator.

Think about how often sound enriches our experience. There’s the sound of unwrapping a present, the crunch of fresh popcorn at the cinema, the satisfying click of the mouse, the music that pumps through your headphones on the train or when you go for a run.

Think of how much less impact a film like Star Wars would have without the blaring themes of John Williams or Darth Vader’s creepy breathing.

Super Mario Bros was known for its joyful tune as well as little blasts of sound effects for everything you did. Grab a mushroom and the game makes a satisfying sound signifying augmentation. Get hit by a bad guy and the game plays a noise representing sorrow and misfortune.

The sounds are like a commentary on the drama and reinforce positive performance.


The bottomless pit learning curves

Super Mario teaches us a lot about learning too. When you first play the game, you die…a lot. It’s annoying but with every new try, you make it a little bit further and there are milestones that help you on the way. Doesn’t that sound like how a workplace should function?

Unfortunately, with most workplaces, we hire ‘qualified’ and ‘competent’ people so we don’t have to go through all that. Human beings are sometimes treated like assets that are installed and then simply operate as per specification.

Imagine what our environments would be like if they were designed to allow people to make lots of mistakes so they could upskill and learn. Think about how you really learn. It’s usually through experimentation, trial and error, and asking people. How many workplaces embrace, let alone tolerate, errors?


Nintendo Controller Simplicity

It’s often assumed that when we introduce a new system or procedure, we need to train people and give them documents. This, to me, is a sign we probably haven’t designed the new solution to be as simple as it needs to be.

I recently overheard a conversation in a workplace where someone said, ‘I feel like we are designing everything around human error and that’s just not right.’ I resisted the urge to butt in and say, ‘Yes it is!’ Design is everything.

When I played Super Mario for the first time it was simple. The controller had a few buttons, clearly labelled and designed for your thumbs. You pressed start and off you went, learning along the way.

When was the last time you used a workplace system that worked as well? It was probably your smartphone, which was designed with the same mentality as a video game.

This frustrated employee didn’t like the idea of continually designing the system to work around the quirks and limitations of people. People needed to work around the system.

But that’s why video games are so much fun. You aren’t spending all your time trying to work out how to play. Someone’s already spent the time working that out for you. You just start playing.


‘Your princess is in another castle’ humour

What makes something funny? It’s when we expect an outcome but are surprised by an alternative. Video games are often surprising and have a good sense of humour.

Super Mario has various castles to conquer, and when you reach the end of one, you are informed, ‘The princess is in another castle.’ The anti-climax is amusing and triggered many kids to scream and laugh at the television in frustration. Get to the end of the entire game and the princess says, ‘…but our princess is in another castle…just kidding.’ The developers had fun making this and designed it so you would have fun too.

In workplaces, we are careful to strip the jokes and humour out of the products and solutions we develop. Sure, we make jokes along the way and have fun. But why do we want to sanitise our documents, systems and surroundings of good old-fashioned fun? When was the last time you read a communication from an executive or CEO that wasn’t carefully crafted and devoid of any humour?

Gamification re-introduces some of these 'fun' elements to work. When it is done well, the fun and gaming elements are integrated seamlessly. When it's done badly, it results in gimmicky trophies, medals and scores slapped on a dashboard. As with any workplace initiative, gamification needs a lot of attention and effort to make it work. Humour can be part of a solution. It just needs to be done well.


Pokemon Go…back to work

I’ve never really liked the term work-life balance. It implies work is something we have to do so we can enjoy our real lives.

People like to quarantine fun from work. Video games are fun and need to be limited. Growing up, we had time limits on how long we could play a game. After all, the game was robbing me of time that could have been better spent on more important stuff like exercise and school. Who would have thought that as an adult I would be able to use and apply all those wasted hours on Super Mario?

Today, I've noticed the same fear of smartphones and tablets. There was probably the same fear of television, and no doubt of radios and storybooks before that. Recently, there was worldwide enthusiasm as well as condemnation of Pokemon Go. I didn't jump on either bandwagon, but the eight-year-old in me had only one thought: 'That looks like fun.'

Meanwhile, I watch as my daughters learn from YouTube and effortlessly navigate their iPad. They've learned to create incredible playdough, beautiful artwork and crafts from online media.

As with video games, I'm not fearful that they are wasting their time. I am more curious about how all of these amazing technologies will be further integrated into our lives in the future. My eldest has already started asking me to show her how to create drawings on the computer.

The technology isn’t a distraction. It’s progress.

Thank you for reading this blog but my insights are in another castle…just kidding.

Seven Dwarf leadership styles. Which one are you?

in Editor Pick/Work Psychology by

My first job ended with a triumphant walk out. I threw my Safeway name badge on the ground and never returned (except to buy things later).

This was not to be a trend in my career nor was it a sign of my immature youth. It was in response to a manager who lost his temper and decided to grab me by the arm, drag me across the store and berate me in front of customers. Let’s call him Grumpy.

Grumpy had an up-and-down personality. He was volatile one day and gregarious the next. Research suggests that an unpredictable personality is worse than someone who's just difficult or unfair all the time. Uncertainty is the best friend of anxiety and worry. At least with a bad-tempered person, you know you'll be unhappy.

On this particular occasion, he was grumpy because someone had been stealing all the painkillers from Aisle One. When I approached him, he was standing in the corner of the aisle on a covert operation to catch the thief red-handed.

Apparently, because I had interrupted this critical mission, he thought it appropriate to drag me across the store and give me a dressing-down in front of the customers. I walked away, a little shaken up and a bit angry, and went back to work.

Grumpy wandered by a few minutes later with a jovial smile. ‘False alarm’, he said.

It was only when a younger colleague joked about the incident and Grumpy's temper that all the lightbulbs in Aisle Four went off. 'That wasn't right!', said a young pre-doctor/pre-psychologist (i.e. me). The badge fell to the ground and I only ever returned to stock up on bread and milk.

Then the phone calls came through. Grumpy was terrified I'd report him to Safeway management because he was on probation for sexually harassing a female colleague at another store.

In retrospect, it may have been appropriate to report him but, like most people, I just wanted to move on to something new and forget the past.

The only benefit of being manhandled and embarrassed was that Grumpy gave me a glowing recommendation when I applied for my next job.

It would be easy to think that Grumpy was the exception. With all the managers—senior and otherwise—across the globe, true leadership is important but hard to find. Here are a few leadership styles that I've observed.


Sleepy leaders are those who are essentially asleep at the wheel as the workplace and world around them change. They are personified by the worst kind of decision—indecision. Ideas are brought to them to improve their business and they fail to see the potential. Poor performers pass under their radar and may even be promoted. The sleepy leader is uninvolved and inspires apathy in their followers.


Sneezy represents the distracted leader who becomes so preoccupied with their immediate circumstances they are as effective as someone having a sneezing fit. I remember one leader who just couldn’t sit still in a meeting to hear a briefing. He’d wander around the room, interrupt you with side stories and even massage your shoulders. I used to liken it to trying to have a discussion whilst someone is juggling and swallowing swords in front of you.


The bashful leader simply lacks self-confidence and steel. I worked with a colleague who felt deeply uncomfortable when their manager confided in them about how they didn't feel they could lead. This manager would worry, feel ineffective when making decisions, and fear that their team didn't respect them. Every leader has doubts, nerves, and fears. A leader should be self-aware and honest but, let's face it, we don't want to work for someone who doubts themselves all the time.


The dopey leader simply makes poor decisions or lacks the subject matter expertise to have an educated opinion. I recall a manager who was facilitating a workshop after a major safety incident. The manager commanded the room and started writing a list of punitive and ineffective actions on the whiteboard—commanding from a place of complete ignorance. A sensible leader defers to the experts and facilitates. A dopey leader makes the decisions anyway.


Everyone loves the happy leader who inspires laughter and fun in the workplace. At best, these leaders help motivate and promote a positive culture. At worst, however, they may not always be realistic and can even side-step issues that drain their energy levels. When I worked in the public service, I observed many a happy leader worn down over time by their worried, more conservative colleagues who wanted to tackle the difficult issues. The happy leader would joke or make light of a situation while their concerned counterpart was more interested in getting an outcome than in feeling good about it.


Then there's Doc, the natural leader. They don't necessarily have any particular characteristics that stand out other than the fact that everyone listens to and follows them. Workplace psychologists have long studied the various traits, styles, motivations, and thinking that go into a 'Doc'.

Docs don’t worry too much but worry just enough. They’re happy enough but happiness isn’t their priority. They’ve got the smarts but rely on their peers as well. They take action and have the guts to do the job without being too overconfident.

Oh, and they usually don't manhandle their employees.
