
932 points sohzm | 1 comment | source
litexlang ◴[] No.44460699[source]
Sorry to hear your story. These days open source is REALLY HARD. Post your GitHub link here and we will support your project by starring it and spreading the word. You definitely need to fight back.
replies(2): >>44460743 #>>44460754 #
npsomaratna ◴[] No.44460754[source]
Not the developer, but here is his repo:

https://github.com/sohzm/cheating-daddy

replies(2): >>44460803 #>>44462376 #
dheerajvs ◴[] No.44462376[source]
As an interviewer, I'm seeing a huge increase in the proportion of candidates cheating surreptitiously during video interviews. And it's becoming difficult to detect any wrongdoing unless you're very watchful, looking for overly academic responses to questions.

Why anyone would encourage building such a tool, I can't fathom.

replies(11): >>44462519 #>>44462688 #>>44462727 #>>44462929 #>>44463152 #>>44463158 #>>44463655 #>>44465775 #>>44465827 #>>44465880 #>>44467024 #
giantg2 ◴[] No.44463152[source]
I won't use it, but I do see it as somewhat symmetric. If the interviewers are using AI, or expect you to use AI for these tasks once you're on the job, then it doesn't seem completely immoral.
replies(1): >>44463368 #
dheerajvs ◴[] No.44463368[source]
That's assuming all interviewers are using AI. And if it's not immoral, why do it surreptitiously?
replies(1): >>44464527 #
giantg2 ◴[] No.44464527[source]
Not just interviewers, but tasks at the company. How many companies don't allow you to use Copilot or similar once you're hired?

Morality and restrictions are two different things. Just because someone makes up a rule doesn't mean it's morally enforceable.