Feeling bad about not telling the truth, because in some situations the truth will just make things worse.
What I mean is when a coworker asks you what you're doing for Thanksgiving or Xmas (already assuming that you celebrate them), and you just tell them some bs to get them off your back and avoid an uncomfortable situation, rather than the truth, which is that I don't celebrate and I'll be completely alone, just like any other day of the year. I'm fine with that, but I'll be treated like a freak if I don't answer the way they're expecting. It ties into the whole "Why do you ask how I'm doing if you don't really care, or only want to hear something positive?" question.
I also ran into my ex-boss recently, who was an SOB, but I put on the fake smile just to make the situation go away as quickly as possible. This was actually at my current job. What was I supposed to do, say, "F U turd, I hope you rot in hell for all the mental anguish you caused me"? You can't say that, especially on the clock. I just hate that in life it seems easier, or even necessary, to lie sometimes instead of being honest. I also wonder how long I could actually last in the world if I told the truth all the time, in all situations. I don't think very long... I'm forced to lie in order to survive, and I hate that.