White women still acting like every adult is comfortable teaching biology to their kids versus the bullshit bible blinders we wear in most ignorant ass American households.
Go ahead and take sex education out of elementary schools if you want to. Girls are menstruating earlier and earlier. Boys are soft as fuck and can't fight. More religion in schools will lead to more pedophilia, more rape, and more kids who can't even explain what the fuck happened because they can't say penis, vagina, or anus. Mix that with no abortion, and all this parental rights shit seems like a bad idea.