It never occurred to me in my life to refer to women as "females" until Gen Z, younger Millennials, and a bunch of purple-haired, liberal arts college professors decided that the word "woman" means everything and therefore nothing.
The word "female" means something unless you're dealing with people who have gone completely off their fucking rockers and insist that biological sex is somehow fluid, or a social construct, or some bullshit aside from the biological fact that it is.
Still, going around referring to women as "females" carries a bad connotation, and I do understand why, at least conceptually.