i feel like society is literally gaslighting us. for YEARS it hated on anything women did or liked. pop music? terrible. staying home? easy. “go make me a sandwich.” now all of a sudden it’s “you’re most valuable embracing your feminine ✨✨”
oh f off