Our son leads a development team and doesn’t want anyone wasting time coding what can be produced effortlessly. I posted previously that his developers are now freed from mundane coding tasks and can focus on solving problems faster and more efficiently. They review Chat’s code and, for anything not quite on target, feed in the corrected alternative so that the faulty interpretation does not recur. This kind of iterative feedback improves both Chat and the user, which is how machine learning is supposed to work.
Currently, he is attending the three-month captain’s course to earn his next rank. His team, with the instructor’s approval, is using ChatGPT-4 to complete all of their homework. They feed in the exact problems (strategic battlefield scenarios, resource allocation, communications, logistics issues, etc.) to see what the program comes up with, and they have been stunned by how acceptable the output is, often requiring little modification. Army Cyber is deeply interested and involved in AI capabilities and is developing and experimenting with them in many ways. What AI can do, let it do, and use manpower to teach it to do better.
The conundrum is that in order to improve AI, people need the skills to evaluate what it produces and modify it as necessary, a chicken-and-egg dilemma. So it is still very important for students to do original work and hone their problem-solving skills, lest they use AI to craft themselves cement lifejackets.
How schools harness this beast will be interesting to watch. I don’t believe policies or honor codes that define or prohibit plagiarism are the way to go. I think we need to embrace the power and accessibility of AI tools and teach students how to use them to best advantage. And, perhaps, change altogether the way we teach and what we expect students to learn.