<p>Research on women's colleges has found that these institutions encourage leadership skills in women, provide more female role models, and encourage women to pursue traditionally male-dominated fields of study.</p>
<p>Several factors help explain these findings.</p>