this post was submitted on 27 Jul 2023
466 points (98.7% liked)
Technology
There's a similar issue in chess with cheating detection. Detection systems use statistical analysis to see whether someone's moves are too good: engines play far above human level, and you can measure how "accurate" each move is against the engine's choice.
A few moves, or even one or two games, don't mean much, but with more data you get more confidence about whether someone is or isn't cheating.
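To make the "more data, more confidence" point concrete, here is a minimal sketch of that kind of test. The numbers are invented for illustration: assume an honest player at some rating matches the engine's top move about 55% of the time, and we observe a suspicious 75% match rate. The same elevated rate that is only mildly unusual over one game becomes overwhelming evidence over ten.

```python
from math import comb

def binomial_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of matching the
    engine at least k times in n moves purely by playing honestly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

baseline = 0.55  # assumed honest engine-match rate (hypothetical)

# One game (~40 moves) at a 75% match rate: suspicious, but not damning.
p_one_game = binomial_sf(30, 40, baseline)

# Ten games (~400 moves) at the same 75% match rate: vanishingly unlikely
# for an honest player under the assumed baseline.
p_many_games = binomial_sf(300, 400, baseline)
```

Real systems are far more sophisticated (they weight moves by difficulty, compare against engine evaluations rather than a single top move, and model the player's strength), but the statistical core is the same: the evidence compounds across games.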
Chess.com released a rather infamous report last year about a high-profile chess player it found to be cheating on its site. They never directly said "he is cheating"; they simply stated that "his games triggered our anti-cheating algorithms."
One is debatable; the other is a simple fact, and truth is an absolute defense to defamation. Hans Niemann did sue Chess.com for defamation, and from what I understand, the case was recently dismissed.
I'd imagine the AI detectors sold to schools use similar wording to avoid legal risk: "high probability of AI" rather than "AI-written." In that case, you may have very little basis for a defamation claim.
However, I'm not a lawyer. I'm just guessing that the companies offering this analysis to colleges have lawyers who have spent time shielding them from legal liability.