World War II was the decisive event that advanced women from their traditionally subservient, predominantly domestic role in society into their dramatically more equal modern role. (It is true that during the immediate postwar period women were expected to return to...