I'm currently a freshman in college pursuing an English B.A. However, I've had my doubts about whether I'd come out making a decent salary, or even find a job for that matter. I've heard countless stories about English majors graduating with nothing to do but teach. I don't really mind teaching; however, I would prefer to teach at a university, in which case I would need to continue my studies past the undergraduate level.

Is it worth going for an English degree? Are job prospects really that slim and salaries dishearteningly low? What other jobs can one do besides teach?