Do companies let you look at new hire documents (the ones you have to sign to become an employee) BEFORE you accept the job offer?

At a small private company I once worked for, HR asked me on my first day to sign an agreement not to sue if I was harassed at work, and made me sign an arbitration agreement covering any dispute I had with the company. Sure enough, I later found out HR made new employees sign these docs because of one asshole: the owner of the company, a giant man baby and misogynistic pig who would openly wig out on people, especially timid women. You could see he enjoyed their fear.

Ever since then, I've never worked for another privately owned company. The owners really do feel you are their property. If I had known about those documents, I would've NEVER accepted the offer.