I don't want to die, but I don't want to live in this world. I feel like I'm constantly bombarded with news of all the bad in the world, and it's depressing. Every time the news is on, all I hear is tragedies: people dying who shouldn't, children raped, a man who hung a dog with a chain and then beat it to death, for heaven's sake! What is wrong with people?

I'm in nursing school, and I keep hearing about health insurance companies and how they just let people die instead of covering them. Like this 17-year-old girl whose health insurance company refused to pay for a liver transplant, claiming it was an "experimental procedure," so she died. I want to be a nurse to help people, and I'm scared I will have patients I have to watch die and can't help because of money. That's just disgusting.

Why are we letting our world crumble? I need to hear some good news, some hope that this world isn't just doomed. I want a baby and a family in the future, but I wonder about bringing a child into this world. I want things to be better!