This isn't a response to another post, but a topic I'd like to talk about. I've been told my whole life, "Things will get better. Don't worry." I've had depression since I was really young, starting at around age five. When I was 13 and started cutting myself, I was told things would get better. At 15, when I entered high school and things got worse, I was told things would get better. When I was 18 and told people I was cutting again, I was told things would get better. At 21, things were only worse, but by then I could drink to cope with it. I was told things would get better. At 24, when I went through a divorce, I was told things would get better. Now I'm 25 and things haven't gotten better. They've only continued to get worse.

Now I'm at an age where people have stopped saying "things will get better." I've stopped talking to people for the most part, and over the last two months I've closed myself off from people who were friends. I don't mind being alone, and to be honest I would rather be alone than with shitty people. Aside from my brother, I don't really know anyone I care to be around.

I don't like my job. I'm in tons of debt, and I'm still years and even more debt away from a degree that will only earn me slightly more than I'm making now. I've had to move back in with my parents. My job is a dead end, with no better employment opportunities around. My health is no longer in good condition.

I'm not saying my life is the worst on the planet, and I'm not saying there aren't positive points about it. But as a genuine, real question: is it going to get better? I think "things will get better" is a bit of an empty statement, said to someone at an emotionally volatile point in their life when things actually will improve. Given the state of my life, though, I don't see any evidence of it improving. And if that really is the case, then I don't feel like going through another 30-50 years of it.