
645 points by ReadCarlBarks | 1 comment | source
icapybara No.44333232
Why wouldn't you want an LLM for a language learning tool? Language is one of the things I would trust an LLM on completely. Have you ever seen ChatGPT make an English mistake?
replies(5): >>44333272 #>>44333473 #>>44334660 #>>44335861 #>>44339618 #
healsdata No.44333473
Grammarly is all in on AI and recently started recommending splitting "wasn't" and attaching the "n't" to the adverb that modifies it. Example: "truly wasn't" becomes "was trulyn't".

https://imgur.com/a/RQZ2wXA

replies(4): >>44333762 #>>44333931 #>>44335746 #>>44337248 #
Destiner No.44335746
I don't think an LLM would recommend an edit like that.

It has to be a bug in their rule-based system, right?
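For what it's worth, a rule-based rewriter really can produce this. Here is a minimal, entirely hypothetical sketch (not Grammarly's actual code) of an adverb-reordering rule that naively treats "n't" as a suffix to reattach, reproducing the reported "was trulyn't" output:

```python
import re

def move_adverb_inside(text: str) -> str:
    """Rewrite '<adverb> <verb>n't' as '<verb> <adverb>n't'.

    A deliberately naive rule: it reorders an -ly adverb and a negative
    contraction, but reattaches "n't" to the adverb instead of the verb,
    which is exactly the reported bug.
    """
    # Match an -ly adverb followed by a negative contraction like "wasn't".
    pattern = re.compile(r"\b(\w+ly) (\w+?)n't\b")
    # Swap the words, but glue "n't" onto the adverb.
    return pattern.sub(r"\2 \1n't", text)

print(move_adverb_inside("It truly wasn't my intention."))
# prints: It was trulyn't my intention.
```

The point of the sketch is that string-level rules have no notion of what a contraction attaches to, so a single misplaced backreference yields output an LLM's learned distribution would almost never rank highly.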

replies(1): >>44338432 #
healsdata No.44338432
Gemini: "Was trulyn't" is a contraction that follows the rules of forming contractions, but it is not a widely used or accepted form in standard English. It is considered grammatically correct in a technical sense, but it's not common usage and can sound awkward or incorrect to native speakers.