I’ve heard it said, and for the most part I’ve observed, that the church is always behind the culture by at least a few years, and I wonder if we’re not behind in our respect for women. For years and years women have held a subordinate standing in culture, and it seems to me that we are catching up. We’re coming closer and closer to holding women in their correct place, that is, as an equal opposite of man—bone of his bone and flesh of his flesh. But why is it that some churches still hold women back from the pulpit? Is it really because they actually think the Bible tells them to? And if so, why are women still allowed to teach Sunday school? Or could it be that we’re just behind the times again?