How do you get your professional self-esteem back?
As a young woman, I joined the workforce about 2 years ago in Miami, FL. Job after job, I have dealt with misogynistic bosses and sexist coworkers, and have been treated as frankly less than human (cheated out of money I was promised, lied to about bonuses, promotions, etc.). I am not in a financially comfortable position and am not able to legally fight back, which my male bosses have known and used to their advantage. This has left me feeling weak, powerless, and like I will never be successful in life. I am switching careers to work for myself, but over the past 2 years I have lost all of my professional self-esteem. It almost feels as though I have become so accustomed to being treated this way that I no longer know what a healthy workplace is supposed to look like. Some quick examples: one male boss would knock on the restroom door, screaming that I had to deal with clients. The same boss called me a r*tard*d b*tch in front of other employees. I quit that job, moved to what I thought was a better company, and somehow ended up in a worse situation: my new boss constantly commented on how I needed to “tone my body” and how clients would “love it.” I am in the real estate industry.
Does anyone have advice? I am so angry at myself for allowing that kind of treatment, and now I feel worthless and stuck. Anything would be appreciated.