Hi, so here's the thing. I've been going through a lot lately. I recently started injuring myself, and I feel like now that I've gotten help, everything has gotten 100 times worse. I feel awful about myself, and I feel more depressed than ever. I didn't feel like this when I did it, and I don't understand why I do now. I'm really regretting telling my family. Now I feel so depressed and alone, and I can barely get through the day. No matter how I try to tell people, they never truly seem to understand the hurt I'm going through. I think I've lost all hope.