AI cheating is really the tip of the iceberg here. High school and college educations rely heavily on cookie-cutter assessment programs that aren't just unreliable but also easy to cheat on. The cynic in me says it's the College Board's fault for doubling down on bubble tests and SAT scores when today's graduates all cheat their way through them anyway.
When a test for a calculus class asks you to find a particular integral, or a test for a music theory class gives you a melody and asks you to add two more voices to produce a three-voice fugue following the voice-leading and counterpoint rules of a typical Baroque composer, the professor is not asking because the calculus professor actually wants to know the value of the integral or because the music theory professor actually needs a fugue written around that melody.
They are asking because they were supposed to have taught you how to evaluate such integrals or apply the rules of Baroque fugue composition, respectively, and they want you to demonstrate that you have learned that.
Doing this requires that you solve the problem, not that you get some tool to do it.
Yes, after school you will use such tools for most such problems that come up, but that's completely irrelevant: at work you are not being asked to solve those problems to demonstrate that you know how to solve them manually. You are being asked to solve them because someone actually needs the solution.
Schools should account for LLM use. In fact, schools should have an official LLM, or sanctioned third-party LLMs, that students are allowed to use.
Why should we believe any of it?
> There are a number of platforms that, when used effectively, can give instructors a clear view of how students are using available AI tools.
I can't help but think his company's product is one of those.
And that it requires installing spyware on the computers students use for their assignments.
So it's not really AI that's at fault. As with any new technology, it disrupts; that's what new things do. The real problem is that the status quo is a lazy approach to education, one that hamstrings teachers by denying them the resources they need to do the work well, all in the name of saving money.
An ironic twist on this "saving money" is that spending more on people nets greater returns. Educate someone well and they can do high-quality work, which earns far more than the minimal investment. Spend money on healthcare and prevention, and overall medical costs go down, because large problems are addressed while they're still small.
But I guess the lure of "lower taxes" is enough for this sort of thing to persist.
If that is the case, then the employers dependent on that workforce need to support it.
Perhaps AI is teaching industry that they can no longer depend on a free lunch.
Any teacher who is letting students cheat with ChatGPT is lazy.
Source: I teach for a living.
My students are allowed to do their homework with AI. But the homework isn't theoretical assignments; it's projects. I assume that being able to use AI will be an important part of their future work, so they should start using it early and experiment with it during their studies. When the calculator was invented, math didn't just vanish; teachers had to adapt what to teach and how to design assignments.
The ones who struggle the most are the ones who have given the same assignments for 20 years and are now confronted with a new tool that makes their tests obsolete.