We define “rape culture” as the culture in which we live that normalizes and glorifies sexualized violence, thereby creating a sense of entitlement to other people's physical, emotional, and sexual beings without consent. This culture is upheld by many different forces.