Dishonesty Will Destroy Your Reliability Efforts

Joseph Bironas
13 min read · May 11, 2021

(This article is 3 of 4 in a series about cultural issues in reliability and software engineering. Be sure to also check out parts 1, 2, and 4)

Honesty and trust are critical prerequisites to reliability. When trust is broken, the most likely outcomes are information hiding and fear of retribution, the natural enemies of a blameless postmortem culture and of the high-trust candor required to create high-quality user experiences in complex problem domains.

Photo by John Dean on Unsplash

A little background

Dishonesty is a natural reaction in some situations. People justify the little white lie to protect someone’s feelings, or avoid delivering bad news out of personal fears. I remember, many years ago, being in charge of a project to tear down a cluster, do a hardware refresh, and turn the cluster back up. It had never been done before, and I was very new to managing such projects. Because of my technical background, I was confident about when and where shortcuts could be taken. But at critical moments, my plans failed. I had to rely on the hard work and extra effort of our hardware operations program management and team to repair the broken plan I’d given them. In the end things turned out well, but not without a lot of pain for a significant number of people.

I tried to spin the outcome as more successful than it was, out of fear of retribution and failure in what was a new direction for me. I was prone to using language that minimized problems rather than presenting them honestly. That was a hard habit to break, but it was not intentionally malicious; I hated being the bearer of bad news. Later I was told I should go meet and work with the people who had to clean up my mistakes. It was framed as a way to improve the process, but it quickly turned into a listening and apology tour.

I’m grateful for that experience. The truth came out, but the effect was strained relationships with good people. What helped was accepting my responsibility, recognizing my proclivities in the moment, apologizing, and making genuine efforts to repair the trust and relationships harmed by my mistakes. I also had to work with my leadership on the consequences of my dishonesty in minimizing (or overinflating) impact.

Whether you’re aware of it or not, dishonesty has probably touched you at work: you’ve either instigated it or been affected by it. Regardless of the severity, if you want high-quality results for your users, you have to build a high-trust, psychologically safe culture — for yourself, your team, and your organization.

Dishonesty in the Workplace

Dishonesty at work ranges from exaggerating qualifications on a resume (not everyone does this; I’ve worked at places that would terminate you if they found out you’d lied on your resume) to taking sick days when you’re really healthy (something I’ve witnessed firsthand; it just made me think less of people, since I generally say yes to all vacation requests). It gets downright pathological, though: people take credit for other people’s work, or they undermine people they don’t like or with whom they are in competition. The truth is, people lie in the workplace. But the truth always comes out in time.

There are good odds you are not encountering a frequent or compulsive liar. If Pareto’s principle applies, roughly 20% of people tell 80% of the lies. By this reasoning, frequent liars are rare, and they tend to have a weak moral compass or see nothing wrong with dishonesty. You can see the impact on the people around them: their personal relationships tend to suffer. Psychologists say they tend to lie because they enjoy it rather than to manipulate or gain control.

If the chances are low that you’re dealing with a frequent liar, then you’re probably dealing with an opportunistic liar. While we may never fully understand someone’s deeply personal motivations, there are signals and clues that can help you understand what you’re dealing with, if you know what to look for.

Fundamental Attribution Error

“It is difficult to get a man to understand something when his salary depends upon his not understanding it.” — Upton Sinclair

Photo by Ankush Minda on Unsplash

We often assume a person is of weak moral character and blame them individually when we discover they are being dishonest. Dan Ariely, a behavioral economist at Duke University, determined in his book “The (Honest) Truth About Dishonesty” that environment matters when people decide whether to lie. He and his team found that people lied regardless of the reward at stake or the risk of getting caught, and they identified several factors that increase the likelihood of dishonesty. Looking at these factors through the lens of the tech industry, some trends are easy to spot (headings paraphrased from Wikipedia):

A high degree of personal creativity and imagination.

As tech leaders, we hire for high levels of creativity and technical skill. We explicitly screen for these attributes during the interview process.

Conflicts of interest.

More often than not, there are conflicts of interest where our intents and business realities don’t align. Examples include incentives to hire fast instead of hiring well, or failing to investigate a subordinate’s or coworker’s wrongdoing because they are a friend (favoritism).

Having personally lied before. Witnessing dishonest behaviors from others. A culture that provides examples of dishonesty.

It’s rare to find someone who has never lied, and the bar for dishonesty is lower for people who have been dishonest in the past when they’re exposed to other environmental factors. When people see others lying and getting away with it in reviews or in escalations, these experiences can reduce the ethical friction around not only their own future behavior, but others’ as well.

Thinking others benefit from our personal dishonesty. The ability to rationalize dishonesty.

Tech in the Steve Jobs era and beyond has been a cult of personality that revels in rationalizing bad behavior. As employees at these companies, we frequently talk ourselves into believing that what we are doing is for the good of the business, or the good of the world, regardless of how disruptive or unethical certain decisions may be.

High sustained stress, fatigue, or burnout.

Tech workers frequently have high sustained levels of stress, fatigue, or burnout, whether due to poorly scoped plans with tight deadlines or interrupt-driven work that keeps people on long schedules with late nights. Tales of burnout and stress-related medical conditions in tech are common.

Often dishonesty comes from someone’s fear or insecurity about how they’ll be perceived. Imposter syndrome is also rampant in the tech industry and could play a part. Dishonesty can be situational, an attempt to protect image or ego, or a response to these environmental factors. Is the individual still responsible for the dishonesty? Yes. But should the environment also be held accountable? Absolutely.

Some Practical Examples

Photo by NeONBRAND on Unsplash

Here are a couple of situations I’ve personally experienced. The goal of these examples is to cover a bit of range, from mild to severe forms of dishonesty, and to show some of the ways you might approach understanding the motivations of the person being dishonest and what your options are.

In both examples, lying is difficult to prove, which is frequently the case. It’s easy for situations like these to devolve into one story against the other (aka “he said/she said” arguments), which are difficult for leadership and human resources departments to adjudicate. Also of note: the dishonest person is in a position of power, and that power dynamic will impact organizational trust. It’s difficult to be prescriptive about what to do in these situations because everyone is dealing with their own unique circumstances, but for reliability engineering to be effective, organizational trust is a critical prerequisite.

Scenario 1

While interviewing, the position is described to you as part of a DevOps transformation. Once you’ve accepted the offer, however, you discover that the hiring manager is not interested in creating a DevOps structure. They are creating parallel development and operations teams, and they want you to manage the operations team.

DevOps transformations typically combine Dev and Ops roles in one team to create tighter communication loops and real empathy between the people who write software systems and the people who operate them. This model has been shown to improve developer happiness and productivity, and to produce more reliable outcomes for software quality and, ultimately, users. Having a hiring manager tell you the role includes transformation is not uncommon in the Reliability Engineering or DevOps space. Before you can be sure someone was being deliberately dishonest, you have to assume good intent. The hiring manager may not understand the last 20 years of research in software development and operations management. Talk through the plans and vision with the hiring manager, and see if they are open to establishing common ground. Try to understand how well they understand what they want and what they need to solve the problems ahead. If they persist in driving their vision and aren’t open to collaborating on adjustments to it, you are likely in a situation where the person was deliberately using dishonesty to “close the hire,” so to speak, which points at other potential problems.

You can also look for other environmental clues. How are PeopleOps functions staffed? If areas like HR Business Partners or Recruiting and Sourcing are underinvested, a lot of pressure and responsibility will be placed on engineering managers to make up the difference. Companies will even extol these lean teams as a virtue, but they increase stress and pressure in ways that incentivize dishonesty. Without proper operational support from teams like PeopleOps, engineering managers are left to learn and perform processes in which they are not experts, which is not the best use of their time. To put it another way, not only are they not doing their own job well, they are doing other jobs poorly as well. Often managers in these environments are measured by how many hires they make, which creates an incentive to hire quickly rather than to hire for skill and culture fit. Perverse incentives are guaranteed to cause team and organizational issues.

Scenario 2

Your manager gives you a directive in a one-on-one. That directive winds up impacting you negatively, and when you remind your manager what they said, they deny it. On top of this, they’ve written it down differently in a document.

First and foremost, take your own notes. Memorialize these conversations so you are not at the mercy of the only documented source later. To be clear, this still might not lead to the outcome you’d hope for, but you don’t want the only record to be the other person’s perspective.

It’s still worth giving leadership the benefit of the doubt. As a leader, you have to keep a lot of context in your head and frequently make decisions on the best information you have at the moment. In dynamic or complex situations, you might be making multiple rapid decisions that seem to contradict each other. Do these qualify as lies? I don’t think so. Good leadership can avoid these situations with well-defined objectives and principles, but everyone deserves the opportunity to learn from their mistakes.

In this example, however, the manager is gaslighting. Gaslighting can’t be explained away the way a lie can. When someone is gaslighting you, there’s an element of domination and control present that is usually employed by narcissists, dictators, or even cult leaders. The goal may be to save face, but they are trying to get you to doubt your own perception. Once this pattern starts, it’s unlikely to stop and will likely only get worse as the person’s narcissistic and dominating behavior intensifies over time. These people can be powerful toxins in a company culture if they are not identified and dealt with swiftly. Which raises the question: why aren’t they? Often because the system of incentives rewards other behaviors they exhibit, such as making fast progress on projects regardless of the body count left in their wake.

Remedies

Photo by Jon Tyson on Unsplash

Leaders need to exemplify honesty

Leadership has a key role in setting the tone for an honest culture: establishing the rules of engagement around honesty and setting clear expectations, but also dealing with dishonest acts. When dishonesty is discovered in an organization, it’s critical that leadership reiterates its principles of honesty and holds the dishonesty accountable so that it doesn’t spread through the organization.

To move past organizational dishonesty, leadership has to set the example and take concrete, visible steps to disincentivize dishonesty. One cheap and easy step is sending regular reminders that the organization values honesty; research suggests that this simple step can improve organizational honesty.

Leaders need to visibly practice and regularly encourage inclusivity and psychological safety to counter the effects of imposter syndrome and other environmental incentives to be dishonest.

Approach Dishonesty with Curiosity

Leaders should approach dishonesty with curiosity to understand why a person is motivated to lie. What environmental factors contributed to lying, and how can those factors be removed? They should watch for conflicts of interest, as well as incentives or situations that create high levels of sustained stress, anxiety, or fear, and remove them wherever possible.

Environmental motivators for dishonesty are common in software engineering organizations. Departments or teams that sustain high levels of stress, fatigue, or fear have higher degrees of dishonesty. Make sure that functions supporting engineering efforts, such as program management, recruiting and sourcing, and PeopleOps, have the staffing they need; otherwise leaders and employees are overloaded and lose effectiveness because they can’t focus on their specializations.

Deal with patterns of dishonesty swiftly

If environmental factors don’t seem to be the cause, and there’s an established pattern of dishonesty, leaders need to remove that person from the team regardless of their perceived influence and impact. If you haven’t established a pattern yet, work with your PeopleOps team to document the behavior and send the appropriate message to the person exhibiting it. Dishonesty severely harms the culture of trust and honesty required in complex problem areas such as software engineering, so remove patterns of dishonesty from the organization before they cause too much harm, if that’s what it takes. These cases can be difficult to prove, so collect evidence from multiple sources where possible to avoid the common he said/she said arguments.

Individuals should leverage their leadership chains and PeopleOps to ensure the dishonesty is captured, which helps establish patterns. If you can’t trust your direct manager, consult with their manager. Talk to PeopleOps. Bring your documentation of events and discuss what happened as objectively as possible. Make clear that your intent is only to help and support the person, and hopefully to prevent the behavior from becoming a pattern.

If all else fails

Have a good sense of your own values, ethics, and principles so that you have a firm understanding of what you are and are not willing to tolerate. If the situation you’re dealing with has pushed you past your limits, and you can’t influence the organization or you recognize a fundamental misalignment of values, you can “vote with your feet” and leave the situation. This may not be possible for everyone, but if you’re an engineer or engineering leader, there’s a good chance there are companies out there that are more suitable, or at least won’t destroy your soul by making you constantly compromise your values.

And lastly, before you accept a new position, you can ask your interviewers questions such as “How do you incentivize honesty in your workplace?” or “Have you ever had to deal with dishonesty in the workplace? What did you do?” Hiring managers and leaders with experience in these issues will have good, multifaceted answers.

Conclusions

Photo by Nick Fewings on Unsplash

An environment is ripe for dishonesty when there are cultural incentives like stress, pressure, and burnout, as well as openly witnessed examples of dishonesty. If your leadership is willing to be dishonest, it might be situational and temporary, but it’s possible that the situation is more persistent than anyone is comfortable admitting, which leads to toxicity and cultural erosion.

Honesty is critical for functions such as reliability engineering because when SREs act out of fear due to institutional frictions or politics, there’s a high risk of information hiding, which leads to systemic problems going unaddressed. Secrecy and information hiding ultimately create a poor customer experience. Building a culture of honesty is a critical prerequisite for the psychological safety and blamelessness that build more robust systems, a continuously learning culture, and higher quality user experiences.

As an individual manager or software engineer experiencing dishonesty, my hope is that you have situational awareness and can make appropriate decisions for yourself. If you’re being lied to and/or gaslit, you still have agency and autonomy. You can accept the behavior, understanding the risks acceptance brings: by tolerating toxic people and actions, you become more likely to be dishonest yourself, and the culture is more likely to suffer. Or you can change your situation in some way: change your environment, or remove yourself from it. There may be substantial friction in either direction, but the decision is ultimately your own.

