I'm new here, and I've been reading through quite a few threads and posts. I've noticed something common among people who receive medical help: their doctors either don't help or make things worse. This worries me, because I've been considering getting medical help myself, but it now seems like that could be pointless or even a bad move.
Is there anyone here who has had a positive experience with their doctor and feels that it genuinely helped them feel better?