Hello! I'd like to start by introducing myself: I'm Ryan, and I'm currently a Computer Science major at Seton Hill University.
Now, to my question. For my Calculus 1 class I was asked to write a paper on how we would use derivatives, and calculus in general, in our major. This is where I'm stuck... I know that I need a lot of math (Calc 1 & 2, Discrete, Linear, Stats), but I'm not sure how calculus will be practical. I've heard that it can be helpful in programming, but I have no interest in programming for the rest of my life. I'm hoping to start my networking and internet security classes next semester, because those areas interest me the most, and I don't see how I would use calculus in security or IT. I've tried looking at many sources, and I haven't had time to make an appointment with my advisor, so I'm kind of stuck at the moment. If anybody has some background or information, that would be great. Thanks!
I was told to write a paper on how we would use Derivatives and Calculus in general for our major.
Machine learning and graphics are quite heavy on calculus and calculus concepts. And I bet that the physics that goes into making a computer work uses calculus too.
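To make the machine learning point a bit more concrete: training a model is basically minimization, and the workhorse is gradient descent, which is nothing but repeatedly stepping along a derivative. A toy sketch in Python (my own example, not from any particular course), minimizing f(x) = x² using its derivative f'(x) = 2x:

```python
# Gradient descent on f(x) = x^2 using its derivative f'(x) = 2x.
# Each step moves x a little way "downhill" along the derivative.
def minimize(f_prime, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * f_prime(x)
    return x

x_min = minimize(lambda x: 2 * x, x0=5.0)
print(x_min)  # very close to 0, the minimum of x^2
```

Real machine-learning libraries do exactly this in many dimensions, with the derivatives computed automatically — but the calculus is the same.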
I had an assignment just like this in high school, except we were able to pick the subject and write about calculus's applications to it. When I wrote about comp sci, I talked about how you'd use l'Hôpital's rule to find the time complexity of different algorithms, big-O notation, and all that. I also talked about taking the limits of different sorting methods as their input size went to infinity, or something like that (I was BSing a little at that point, because I couldn't find much information connecting calculus with comp sci). Most of my paper was BSing, actually, since all I found that was useful was a sentence about l'Hôpital's rule here: DS - Algorithm Analysis
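For what it's worth, you can sanity-check that l'Hôpital-style growth-rate argument numerically. A quick sketch of my own, just plugging in big values of n: the ratio log(n)/n should shrink toward 0, which is exactly why an O(log n) algorithm eventually beats an O(n) one.

```python
import math

# lim n->oo of log(n)/n is 0 (what l'Hopital's rule gives you symbolically);
# numerically, the ratio keeps shrinking as n grows.
ratios = [math.log(n) / n for n in (10**3, 10**6, 10**9)]
print(ratios)  # each entry smaller than the last, heading toward 0
```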
Also, there is a course, taught jointly by the computer science and math departments, called numerical analysis.
That's the opposite of an application of calculus in computer science: you are using computational methods to solve (or approximate) calculus problems.
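For the curious, here's a minimal sketch of what that looks like in code (a textbook trapezoidal rule, my own toy version): it approximates a definite integral, i.e. it uses computation to solve a calculus problem, rather than the other way around.

```python
def trapezoid(f, a, b, n=1000):
    """Approximate the definite integral of f over [a, b] with n trapezoids."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))  # endpoints get half weight
    for k in range(1, n):
        total += f(a + k * h)    # interior sample points
    return total * h

approx = trapezoid(lambda x: x * x, 0.0, 1.0)
print(approx)  # close to the exact value 1/3
```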
aegrisomnia:
That's the opposite of an application of calculus in computer science: you are using computational methods to solve (or approximate) calculus problems.
I'm not sure what you suggested is really much better:
Machine learning and graphics are quite heavy on calculus and calculus concepts.
These are applications of computer science, at least depending on how you define computer science. The only reason that these are "heavy on calculus" is because the application area is heavy on calculus; not all areas of CS application share these properties, so I would find these as invalid as numerical analysis.
And I bet that the physics that goes into making a computer work uses calculus too.
This is a pure flight of fancy, right? The physics of computers has nothing to do with computer science as most people understand it. Certainly there are aspects of hardware physics which are taken into account in various application domains (temperature/power-aware scheduling come to mind as great examples), but these are also applications in the sense that the usefulness of calculus is incidental, not fundamental, to computer science.
I think the real issue here is this: does computer science include application areas (machine learning, graphics, numerical methods, etc.)? If not, what does computer science include? Note that evaluating series and sequences is something often taught in introductory calculus sequences, and this finds direct and unequivocal application in the analysis of algorithms. In fact, some of the clearest examples of the potential for calculus in computer science proper seem to be in the analysis of algorithms, and in particular algorithmic complexity: evaluating the limits of series and sequences. You can imagine scenarios where derivatives, integrals, and limits would all be useful (since all can be useful in working with series and sequences).
Unless clear bounds are put onto what belongs to CS and what belongs to application areas, I'm not sure how much more there is than that.
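As a concrete instance of the series-and-limits point (a toy sketch of my own): the comparison count of a simple double loop is the series 1 + 2 + ... + (n-1) = n(n-1)/2, and taking the limit of its ratio to n²/2 is exactly how you justify a Θ(n²) bound.

```python
# Count iterations of a classic double loop (the inner loops of selection
# sort or bubble sort look like this): the total is the series
# 1 + 2 + ... + (n-1), whose closed form is n(n-1)/2.
def pair_count(n):
    count = 0
    for i in range(n):
        for j in range(i):
            count += 1
    return count

n = 500
# The ratio to n^2/2 tends to 1 as n grows, which is the Theta(n^2) claim.
ratio = pair_count(n) / (n * n / 2)
print(ratio)  # just under 1
```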
These are applications of computer science, at least depending on how you define computer science. The only reason that these are "heavy on calculus" is because the application area is heavy on calculus; not all areas of CS application share these properties, so I would find these as invalid as numerical analysis.
I am not sure what you are getting at.
I do think that there's a difference between "using calculus to solve problems in computer science" and "using computer science to solve problems in calculus." Numerical analysis comes up with algorithms to solve calculus problems, so it's strictly in the second camp. Computer graphics does the opposite: you have a problem in computer science (how to represent in memory and render on a 2-d screen an image of a 3-d scenery) and you are using calculus concepts to solve it.
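To put a small example behind the graphics claim (these are the standard cubic Bézier formulas, though the helper names are mine): curves on screen are usually Béziers, and the tangent you need for stroking, normals, or animating along a path comes straight from differentiating the curve.

```python
# Cubic Bezier curve B(t) and its derivative B'(t), for 2-d control points.
def bezier(p, t):
    """Point on the curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(u**3 * p[0][i] + 3 * u * u * t * p[1][i]
                 + 3 * u * t * t * p[2][i] + t**3 * p[3][i] for i in range(2))

def bezier_tangent(p, t):
    """Derivative B'(t): the tangent direction along the curve."""
    u = 1.0 - t
    return tuple(3 * u * u * (p[1][i] - p[0][i])
                 + 6 * u * t * (p[2][i] - p[1][i])
                 + 3 * t * t * (p[3][i] - p[2][i]) for i in range(2))

pts = [(0, 0), (1, 2), (3, 2), (4, 0)]
print(bezier_tangent(pts, 0.5))  # horizontal tangent at the curve's apex
```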
Are you now appealing to transitivity? You might have a point there. HOWEVER, even then, numerical analysis is only useful in CS because there are branches of computer science whose solutions rely on calculus. In other words, without an actual use of calculus in another branch of CS (like graphics or machine learning), numerical analysis would be fundamentally useless for computer scientists.
Outside of the numerical analysis example I gave, my first suggestion would be to Google "calculus" and "cryptology" together; there will be results that associate the two. I mention those two terms because the OP stated an interest in computer security, which cryptology directly relates to. Try:
+cryptology +calculus
or even
+cryptology +derivative