Well, the dictionary definition is a social system and/or government where men hold the power and women are largely excluded. While there have been several strides toward gender equality here in the US, there are still a lot of deeply rooted social structures and norms that favor men. That's why it's still deemed a patriarchy. I highly recommend doing your own research if you want to learn more.