Foreign languages don't change and they don't get more complicated. Even if they do, it happens over such a long period of time that it's not noticeable.
Yeah, but that doesn't matter. Sure, foreign languages don't change, but that doesn't mean there isn't a massive amount to learn before one becomes even passable at one (which is not even close to being fluent).
Look, even if you get a bachelor's degree in a foreign language with straight A's from a top program, that still doesn't mean that you will feel truly comfortable using it if you then had to move to that country and interact with native speakers every day, and you certainly wouldn't be able to engage in highly complex discourse without extensive additional study. And heck, you can always continue to learn more and more. For example, it is estimated that there are more than 50,000 Chinese characters (including dialectal variants), although even the most educated native Chinese speakers can rarely recognize more than, say, 25,000 of them. Furthermore, even recognizing the characters doesn't mean that you know all of the slang usages and idioms that they can convey.
Engineering is harder because you have to take more courses than other majors, and I don't see the courseload becoming lighter any time soon. The more advances industry makes, the more you'll have to learn in school.
I don't see how that's the case. If anything, I would argue that the fast pace of innovation should actually make engineering easier to learn. Why? Because the more that things advance, the more that becomes obsolete, which means that you don't really have to know that old stuff. Sure, it's nice to know it, but you don't really need to know it.
I'll give you a case in point. I think we can all agree that computer science, and especially Internet technologies, are arguably the most innovative and fast-moving technologies on Earth. But the fact is, if you're willing to put in the effort, it wouldn't take you that long, maybe a year, to study some basic books on programming and information technology and develop sufficient skills to get an entry-level computer/IT job. For example, you can read introductory books like 'Visual C++ in 21 Days', then progress to more advanced books, and after a few months of constant practice, you'll probably be good enough to get at least a part-time job as an entry-level programmer. Heck, that's what my brother did one summer: starting out knowing nothing, by the end of the summer he was already writing simple video games. Furthermore, once you become good at one programming language, it's not that hard to learn others, so you can quickly pick up Java, Python, Ruby, and other highly topical skills and, with sufficient experience, get quite a high-paying job. Similarly, it's not that hard to pick up skills in Linux, open-source databases (e.g., MySQL), Cisco routers, and Internet server technologies like Apache, Ajax, and so forth. I know several guys who were being offered six-figure jobs before they had even graduated from high school because they had extensive knowledge of Web applications. You don't need a college degree for that.
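To make the skills-transfer point concrete, here's a minimal sketch (in Python, chosen purely for illustration) of the kind of exercise an introductory programming book drills: the control flow below maps almost line-for-line onto Java or Ruby, which is exactly why the second and third languages come so much faster than the first.

```python
# A toy illustration of why programming skills transfer between
# languages: the same conditional logic reads nearly identically
# in Python, Ruby, or Java -- only the surface syntax changes.

def classify(n):
    """Return a label for n, FizzBuzz-style."""
    if n % 15 == 0:
        return "fizzbuzz"
    if n % 3 == 0:
        return "fizz"
    if n % 5 == 0:
        return "buzz"
    return str(n)

if __name__ == "__main__":
    print([classify(n) for n in range(1, 16)])
```

Trivial, sure, but once you've internalized functions, conditionals, and loops in one language, an exercise like this in a new language is a matter of looking up syntax, not relearning concepts.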
What makes computer science and Web technologies so easy to learn (but, frankly, also so hard to keep up with) is that, like I said, every technology quickly becomes obsolete. Hence, while there is clearly a lot that you could learn, you don't really need to learn most of it, especially the old stuff. For example, while you could learn all about how to write programs for MS-DOS, the question is: why would you? Who cares? Nobody uses MS-DOS these days anyway. It's completely obsolete. Similarly, there is little reason to learn the Windows 9x or even the Windows 2000 environment, because, again, practically everybody has moved on. Heck, even Windows XP will soon be obsolete (Microsoft will discontinue standard support for XP in two years, by which time the vast majority of users will have migrated). Similarly, you don't really need to know how to write, say, Java applets, because practically nobody actually does that nowadays, as almost all of today's dynamic Web content is delivered via Ajax or Flash (e.g., YouTube).
The point is, computer science and Web technologies are clearly highly innovative and fast-paced, but that also means that, frankly, you don't really need to know how to use any version of a technology that is more than, say, 2-3 years old in order to get a good job. Again, don't get me wrong, it may be nice to know some of the older technology. But you don't really need to know it, because there are plenty of high-paying jobs available even if you only know the new stuff.
As a case in point, I know a bunch of guys who work as computer network designers and administrators who not only don't know how to configure older routers like the Cisco 2500 series (which was the best-selling router line in history), they've never even seen one. They don't know it, and they don't need to know it. Those routers were top sellers in their day, but they're now completely obsolete, and customers want the new equipment. Hence, they don't really need to know how the old kit works. All they need to know is how the new Cisco kit works.
In fact, the quick obsolescence of software skills is precisely what makes the value of experience in that industry so constrained. That's why the software industry, and especially the dotcoms, has repeatedly been characterized by age discrimination, in which firms replace their older engineers with young kids fresh out of college (or even high school). The harsh truth is that a programmer with 30 years of experience is really not that much better than somebody with just 3 years of experience, simply because those 30 years of experience mostly consist of knowledge of technologies that are now obsolete. So, sure, the older guy may know a lot more, but he knows a lot more about the wrong things (or at least, that's what the company believes). A kid who has been developing Web 2.0 technologies for the last couple of years may not know a whole lot relative to the total body of CS knowledge that he could know, but he knows the right things (again, in the company's eyes).
Now, I know what some of you are thinking - that maybe CS and Web services aren't "real" engineering. I and many others would disagree, but, fine, have it your way. Let's take electrical engineering, which I'm sure we can all agree is 'real' engineering, and let's talk about its most innovative subsector, which is almost certainly computer engineering. How does an ISA bus work? How about RDRAM? How about the Intel NetBurst processor architecture? Or how a Sun Ultra motherboard works? Or the Sun SSP? Better question - who cares? Nobody uses that stuff anymore anyway. The fast pace of hardware innovation means that you don't really need to know about obsolete computer architectures.
The upshot is that just because a field of study is innovative doesn't necessarily mean that you need to learn more in order to possess competent knowledge. In fact, the exact opposite may be true. Foreign languages hardly ever change, but they are also fiendishly difficult to become competent in. Computer technologies, on the other hand, change all the time, yet that also means that there is only a limited window of recent knowledge that you actually need, because anything beyond that window is obsolete.
As an engineer you are expected to have a working set of skills by the end of your education. And you are expected to PERFORM. You are expected to be able to survive the real world. You are expected to be able to solve a given problem in your field. You are expected to make your company money, and provide consumers with a working product.
The real world is what makes engineering hard. Other majors are not designed to prepare you for all of the above. They are primarily designed to provide intellectual stimulation.
Yeah, I think what electrifice just said here is far closer to the truth. The other majors, frankly, just have lower expectations. A foreign language bachelor's degree program by itself doesn't really prepare you to communicate competently in that foreign country, nor does the program even try to. For example, I know several people who got bachelor's degrees in foreign languages, then actually tried to live in those countries, and found out that all they knew was "baby Japanese" or "baby French" or "baby Chinese". Sure, they could make themselves understood, but only awkwardly. It took them a long time of actually living in those countries and being forced to speak the language every day, all day, before they felt comfortable (and even then they were still clearly far from fluent).
But that also raises the question: why don't those other programs simply have higher expectations? Why don't they just assign more work, and then flunk the students who don't do it? Why not? After all, I thought colleges were supposed to teach their students how to work hard. The engineering programs certainly do that. But shouldn't every student benefit from learning how to work hard? So why only the engineers?