Are there any students here who started uni just before LLMs took off and are now finishing their degrees? Have you noticed much change in how your classes are taught?
Instead of a paper exam asking students to "find the bug" or "implement a short function", they get a take-home exam where they have to write tests, integrate their project into a CI pipeline, use version control, and implement a Dropbox-like system in Rust, which we expect to have a good deal of functionality and accompanying documentation.
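For a sense of scale, even the core "detect local changes" piece of such a system is nontrivial on its own. Here's a toy polling sketch in std-only Rust; the directory name and every other specific are made up for illustration, not taken from any actual assignment spec:

    use std::collections::HashMap;
    use std::fs;
    use std::io;
    use std::path::{Path, PathBuf};
    use std::thread;
    use std::time::{Duration, SystemTime};

    // Snapshot each file's modification time so two scans can be diffed.
    fn snapshot(dir: &Path) -> io::Result<HashMap<PathBuf, SystemTime>> {
        let mut map = HashMap::new();
        for entry in fs::read_dir(dir)? {
            let entry = entry?;
            let meta = entry.metadata()?;
            if meta.is_file() {
                map.insert(entry.path(), meta.modified()?);
            }
        }
        Ok(map)
    }

    fn main() -> io::Result<()> {
        let dir = Path::new("./synced"); // hypothetical directory to watch
        let mut prev = snapshot(dir)?;
        loop {
            thread::sleep(Duration::from_secs(1));
            let curr = snapshot(dir)?;
            // New or changed files since the last scan.
            for (path, mtime) in &curr {
                match prev.get(path) {
                    None => println!("created: {}", path.display()),
                    Some(old) if old != mtime => println!("modified: {}", path.display()),
                    _ => {}
                }
            }
            // Files that vanished between scans.
            for path in prev.keys().filter(|p| !curr.contains_key(*p)) {
                println!("deleted: {}", path.display());
            }
            prev = curr;
        }
    }

And that's just naive change detection, no recursion into subdirectories, no syncing, no conflict handling. The actual exam wraps tests, CI, version control, and documentation around far more functionality than this.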
I tell them to go ahead and use whatever they want; it's easier than policing their tools. If they can put it together, it works, and they can explain it back to me, then I'm satisfied. Even if they use ChatGPT, it'll take a great deal of work and knowledge to get it running.
If ChatGPT suddenly becomes able to put a project like that together, then I'll ask for even more.
Students today will be practitioners tomorrow, and those who know how to work with AI will be more effective than those who don't.
Using AI is a skill too. People who use it every day quickly discover how poor their results are compared to the genuinely skilled. Ever compared your own best AI art to the top-rated stuff on Civit.AI? Pretty sure yours will be garbage, and the community will agree.
When every student can write code that compiles, then you can ask them to write good code. Fast code. Robust code. Measure it, characterize it, compare it.
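To make that last point concrete, "measure it" can start as crudely as wall-clock timing two implementations of the same function and comparing. A minimal sketch; the function names and workload are mine, purely for illustration:

    use std::hint::black_box;
    use std::time::Instant;

    // Two ways to sum a range: same result, possibly different codegen.
    fn sum_loop(n: u64) -> u64 {
        let mut total = 0u64;
        for i in 0..n {
            total += i; // sum of 0..1e8 fits comfortably in u64
        }
        total
    }

    fn sum_iter(n: u64) -> u64 {
        (0..n).fold(0u64, |acc, i| acc + i)
    }

    fn time_it(label: &str, f: impl Fn(u64) -> u64, n: u64) {
        let start = Instant::now();
        // black_box keeps the optimizer from deleting the work outright.
        let result = black_box(f(black_box(n)));
        println!("{label}: {:?} (result {result})", start.elapsed());
    }

    fn main() {
        let n = 100_000_000;
        time_it("loop", sum_loop, n);
        time_it("iterator", sum_iter, n);
    }

Even this toy version teaches the right lessons: run it in debug vs. release mode and the numbers change dramatically, which is exactly the kind of characterization you want students to learn to do.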