How did women become nurses and teachers in the early 20th century?
Teachers' colleges (often called normal schools) and nursing training programs had existed since the very late 1800s, so by the 20th century it was basically a matter of "going to school" to be trained. As for nursing, many hospitals ran their own training courses, so "college" wasn't really necessary. MANY districts required a "teaching certificate" even in the 1800s, so the idea that "a woman only had to be able to write her name and do math" is a false statement. Today's politically correct rewrite of history would have you believing that women were "slaves of the men" until the 1960s. This is hogwash, meant to convert people to the "everybody is a victim" school of thought.