I used to view solitude as a painful experience, one to be avoided at all costs. Yet after some bad experiences in the past, relationships with women gave me nightmares, and socializing with friends gave me neither reassurance nor mutual understanding. Amusingly enough, during a bout of depression over the past few years, I found out that the more isolated I am from other people, the more content I become with my life. To be honest, I just feel annoyed when my therapists keep reiterating how important it is, and what a blessing, to get back into the shallow hugfest of pseudo-empathy that society offers. Does anyone share these thoughts?