There is a Panchatantra tale (the Panchatantra being an old collection of Indian moral stories for children) of a Brahmin (a member of the high priestly class of Hindus) who, while taking a goat along with him, gets fooled by a band of swindlers working together. Each of them encounters the Brahmin in turn and tells him that what he is leading along is a street dog, not a goat, and that this is an insult to his honor. They keep at it meticulously until he falls for the trap and lets go of the goat.
When I first came across the story as a kid, I could understand neither the stupidity of the Brahmin nor the moral of the story – "how could someone be so foolish that, on hearing a repeated bluff, one falls for it even at the cost of sacrificing one's own judgement?"
It wasn't clear to me at the time that there is always an element of bargaining and relative prioritization constantly taxing the mind, which makes it comfortable to believe an untruth and join the bandwagon of mass appeal. For one, it is rather uncomfortable and time consuming to take the route of objective evaluation and getting down to the detail (often standing alone) to discover the truth for oneself. For another, the question "What if I stand alone with this and end up being a fool?" is a tough bargain to face vis-à-vis "these are my people, and hence they may be telling me the truth…"
Fast forward several years into the real world of "bluff by design", and this story makes phenomenal sense. The intent is not to claim that people bluff on purpose. But over a prolonged period of either "consciously allowing deviations" (from what is supposed to be the right way of doing something) or "consciously overlooking deviations" (under the pretext of "it's fine/good enough for now"), "bluff by design" creeps into the system, perceptions take over reality, and "genuine goats start to appear like dogs" and vice versa – with all due respect to both cohabitants of the planet.
In standard terms, I am referring to the "errors of commission" and "errors of omission" that creep in for lack of someone (or an instituted approach) constantly getting down to the details, challenging prima facie impressions and attempting to discover a better insight.
And then there is jargon, along with newer interventions of technology and methodology, which act as catalysts to chaos (in an environment of "bluff by design" only, not in an ideal world!) in the absence of – 1. CMEs (context matter experts) who can unravel the current bluffiness and look beyond the mist, and 2. good managers/drivers who drive execution with result orientation.
If you come across some of the following or similar…
– repeated complaints/strong opinions against someone or something, voiced by people without the right background/experience, objectivity or big picture
– a lot of green weekly reports leading to a "sudden" and "unforeseen" situation that grinds the project to a halt
– a lot of "leadership vision"/"marketing talk" about how a new technology/player is going to solve the world's problems for you, without a plan of execution or proof points that take the current context into account
…then depending on what role you play, you may gently be subjecting someone else – or being subjected yourself – to the "genuine goat syndrome" (GGS). These behaviors are not always backed by malicious intent, but they usually settle into the system for lack of constant pruning/coaching/questioning as counter behaviors. By asking next-level questions and helping the system (a person or a team) gently take the painful path towards discovering the detail, GGS can be delayed or avoided. And if you are in a position of power (possessing a desirable goat, moral-tale-fully speaking), the propensity of those around you to subject you to GGS is quite high.
Some of the major transformation/change management debacles in the industry occurred when someone (a decision maker) did not do enough due diligence and fell prey to GGS from, say, a product vendor/marketeer ("what you have is wrong, I will give you the right one") at a key decision point, leading the entire organization down a wrong path. This then propagates through the organization as a chain effect – "if my boss believes it is a dog, it must be so…" – and the entire organization runs with making the decision right, instead of questioning the hypothesis.
So, for what it's worth, some of the following might be worth considering to help delay GGS and prevent wrong decisions:
- Surrounding oneself with competent people that understand the big picture
- Validating the competence/experience of the one making the claim of “hey! it’s not a goat, it’s a dog!”
- Asking for proof points, questions, counter arguments, alternatives
- Experimenting in a controlled/contained environment before taking the big leap
- Sticking to fundamentals/company values (which requires being clear about what one's fundamentals should be in a given role)
Each time GGS is overcome with the right counter (and balancing) behaviors, it not only prevents grave and often time-consuming mistakes, but also moves the entire organization towards a more positive and resilient culture.