Comments

David Rudling September 4, 2020 7:51 AM

In Europe, Article 22 of the General Data Protection Regulation (GDPR) provides for at least some algorithmic transparency. I am no lawyer, but I think the obviously fully automated nature of the so-called AI algorithm quoted here means that a European student could demand to see the algorithm to which (s)he was subject. Such transparency would surely put a swift end to this sort of rubbish software and, hopefully, to the companies peddling it.

Winter September 4, 2020 8:47 AM

@David Rudling
“a European student could demand to see the algorithm to which (s)he was subject.”

I am not sure it will go that far, but a human must sign off on the grade AND the student can ask for recourse. It is true that grading must be transparent, and current AI seldom is.

MikeA September 4, 2020 10:48 AM

The AI seems to have been correctly imitating the usual bored grad-student T.A. getting paid minimum wage to do that task.

Meanwhile, on the “algorithmic transparency” front, I remember when the U.C. Berkeley Student Association went to a ranked-choice voting procedure (1970 or so), and a group of C.S. students asked to see the source code of the program that would be tabulating the votes. I assume it will surprise nobody here that the request was denied, because (from memory, so paraphrased) “We don’t want to help hackers game the system.”

Serg September 4, 2020 12:57 PM

In theory, in Europe you should be able to know the algorithm used. In practice, the algorithm is sometimes hidden even in a matter as sensitive as electricity subsidies for people at risk of poverty in Spain:

https://civio.es/novedades/2019/07/12/being-ruled-through-secret-source-code-or-algorithms-should-never-be-allowed-in-a-social-and-democratic-state-under-the-rule-of-law/

The case has been taken to court and the judge has ruled that the code should be public, but the administration is citing intellectual property and even national security as grounds to deny access (and will take it to higher courts)…

Steven September 4, 2020 6:59 PM

Security issues aside, automated grading is horrid.
One of my kids had something like this: not for history, but for physics.
The teacher couldn’t be bothered to assign and grade proper homework.
Instead, he fobbed the kids off onto a web app.
– go to the site
– get a problem
– solve the problem
– type in the numerical answer
– right answer? go on to the next problem
– wrong answer? try again
The web app allowed maybe 0.5% margin for rounding error, and you got 5 tries before it failed you on that problem.
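
To make that concrete, the check the app performs presumably amounts to something like the sketch below. The 0.5% tolerance and five-try limit are from my description above; the names and structure are invented for illustration, not the actual product's code.

```python
# Rough sketch of the kind of check such a web app might perform.
# Tolerance (0.5%) and attempt limit (5) are from the description above;
# everything else is a hypothetical illustration.

def within_tolerance(submitted: float, correct: float, rel_tol: float = 0.005) -> bool:
    """Accept the answer if it is within 0.5% of the expected value."""
    return abs(submitted - correct) <= rel_tol * abs(correct)

def grade_problem(correct: float, attempts: list[float], max_attempts: int = 5) -> bool:
    """Pass if any of the first five attempts lands inside the tolerance band."""
    for submitted in attempts[:max_attempts]:
        if within_tolerance(submitted, correct):
            return True
    return False  # five misses and the problem is marked failed

# e.g. grade_problem(9.81, [9.8]) -> True; grade_problem(9.81, [12.0] * 5) -> False
```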

It sounds reasonable in the abstract, but in practice it was utterly wretched.
All learning is, at some level, an interaction–a conversation–between student and teacher.
Even if it is nothing more than a red check mark or a red X on a homework paper,
you have communicated something to some person and gotten some response.
You don’t realize how important this is until it is gone.

With nothing but a machine to talk to, it stops being about learning.
It is just about satisfying the machine by whatever means necessary.
In his rage and frustration, my son told me that the easiest way to solve the problems was to copy and paste the problem text into Google. This would reliably return the general formula for solving that problem; plugging in the numbers that the web app had generated for your instance of the problem would then yield the correct answer.
By the end of the school year, I was telling him that if he didn’t want to deal with the web app, he should use Google to get his grade, and if he wanted to learn physics, I would teach it to him.

Automated essay grading is even worse.
There is no point writing prose unless a human is going to read it.
When I want to talk to machines, I write code.

Writing songs that voices never share…
— Paul Simon

Singular Nodals September 5, 2020 5:38 PM

Hopefully these AI systems will soon be conducting certification too.

https://dilbert.com/strip/2000-08-31

Thunderbird September 9, 2020 11:26 AM

Hmm. Wonder if my comment is being moderated or there was a discontinuity?

At any rate, comment spam above from “jon”.

Antistone September 9, 2020 2:09 PM

I think the article is overusing the word “cheating”. Tailoring your answer to the grading system is not cheating, IMO. If you mention puppies in your answer because you know your teacher likes puppies, and the teacher gives you bonus points for that, then it is possible that the teacher is “cheating”, but the student is not. Similarly, if the student submits word salad because they know it gets full marks, then the grader might be “cheating”, but the student is not.
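
To illustrate: a grader that just counts expected keywords cannot tell a coherent answer from word salad. A minimal sketch, with an invented keyword list and scoring rule (not taken from any real grading product):

```python
import re

# Hypothetical keyword-matching scorer, only to show why word salad can earn
# full marks. The keyword list and scoring rule are invented for illustration.

KEYWORDS = {"constitution", "amendment", "federalism", "ratified", "convention"}

def keyword_score(essay: str) -> float:
    """Score = fraction of expected keywords that appear anywhere in the text."""
    words = set(re.findall(r"[a-z]+", essay.lower()))
    return len(KEYWORDS & words) / len(KEYWORDS)

coherent = "The convention drafted the constitution, which was later ratified."
salad = "constitution amendment federalism ratified convention puppies puppies"

print(keyword_score(coherent))  # 0.6 -- a genuine sentence misses two keywords
print(keyword_score(salad))     # 1.0 -- word salad hits them all
```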

(Looking up answers online is probably cheating, depending on the rules of the test. Though I’d argue that it’s kind of irresponsible for a teacher to use unmonitored remote tests where looking stuff up online is both significantly helpful and forbidden, because that’s obviously going to result in cheaters getting an advantage with a very low probability of being caught.)

I also think that if a student submits an answer that ought to be judged highly under your ostensible or implied grading standards, but is judged poorly under your actual (secret) grading standards, then that student has a legitimate grievance.

Furthermore, I think it’s crazy to suppose you could use a grading system like this on a large scale and not have any students figure it out.

JoeHx September 10, 2020 3:13 PM

Their cheating reminds me of some of the early (black-hat?) SEO attempts. Keyword-stuff your articles to get to the top page of Google (or AltaVista, if you’re older than twenty-two).
