<p>I’m a senior in high school going into nursing next year, and I hear the exact same thing all the time! One girl even asked me, “do you even have to go to college to be a nurse?” I was absolutely shocked! Although I wouldn’t put too much stock in what that particular girl thinks, because she’s going to be majoring in international relations, and last month she asked me why the US just didn’t take over Africa and make everything better. hmmm…</p>
<p>I do my best to tell people about nursing instead of getting offended, though. When they ask, “why don’t you just be a doctor?” I simply tell them, “because I don’t want to be a doctor. I want to be a nurse,” or “nursing is a better fit for me.” Whenever they make completely uninformed statements about nursing, I correct them gently. I have to remember that I’m not well informed about the academic programs for a lot of other majors, and I don’t know much about other careers either, so I can’t judge people for not knowing much about nursing. Fortunately, my mom is an RN and my dad is a doctor, and they are both very supportive of my decision to become a nurse.</p>