Helen Andrews argues woke culture is the inevitable result of women taking over pivotal industries such as law, media, and medicine.


