Sorry! What I'm talking about is schools that actively follow Christian values and ideals and are stricter about that sort of thing. Schools like Georgetown are religiously affiliated, but they've strayed from their Christian roots and become much more secular.
What I'm asking for is the academically strongest colleges that still adhere to Christian values.
Hope that's less confusing!