Normally I just delete these as spam, but this summer program for graduate students 1) looks interesting, and 2) has some scholarship money available.
I have had mixed feelings about this since I was asked early this year. The best piece of advice I’ve read is: “It’s a great honor, but it is an honor, not a degree.”
EDITED TO ADD (12/14): It was a Doctor of Science, not a Doctor of Philosophy. I changed it above.
As you’d expect, it’s not very good:
But this measure [Turnitin] captures only the most flagrant form of plagiarism, where passages are copied from one document and pasted unchanged into another. Just as shoplifters slip the goods they steal under coats or into pocketbooks, most plagiarists tinker with the passages they copy before claiming them as their own. In other words, they cloak their thefts by scrambling the passages and right-clicking on words to find synonyms. This isn’t writing; it is copying, cloaking and pasting; and it’s plagiarism.
Kerry Segrave is a right-clicker, changing “cellar of store” to “basement of shop.” Similarly, he changes goods to items, articles to goods, accomplice to confederate, neighborhood to area, and women to females. He is also a scrambler, changing “accidentally fallen” to “fallen accidentally;” “only with” to “with only;” and, “Leon and Klein,” to “Klein and Leon.” And, he scrambles phrases within sentences; in other words, the phrases of his sentences are sometimes scrambled.
Turnitin offers another product called WriteCheck that allows students to “check [their] work against the same database as Turnitin.” I signed up and submitted the early pages of Shoplifting. WriteCheck matched many of Shoplifting’s phrases to those of the New York Times articles in its library of student papers. Remember, I submitted them as a student paper to help Turnitin find them; now WriteCheck has them too! WriteCheck warned me that “a significant amount of this paper is unoriginal” and advised me to revise it. After a few hours of right-clicking and scrambling, I resubmitted it and WriteCheck said it was okay, being cleansed of easily recognizable plagiarism.
Turnitin is playing both sides of the fence, helping instructors identify plagiarists while helping plagiarists avoid detection. It is akin to selling security systems to stores while allowing shoplifters to test whether putting tagged goods into bags lined with aluminum thwarts the detectors.
You can now get a Master of Science in Strategic Studies in Weapons of Mass Destruction. Well, maybe you can’t:
“It’s not going to be open enrollment (or) traditional students,” Giever said. “You worry about whether you might be teaching the wrong person this stuff.”
At first, the FBI will select students from within its ranks, though Giever wants to open it to other law enforcement agencies. Rather than traditional tuition, agencies will contract with the school, paying about $300,000 a year for groups of 15 to 20 full-time students, according to documents submitted to the board of governors of the State System of Higher Education.
In Applied Cryptography, I wrote about the “Chess Grandmaster Problem,” a man-in-the-middle attack. Basically, Alice plays chess remotely with two grandmasters. She plays Grandmaster 1 as white and Grandmaster 2 as black. After the standard opening of 1. e4, she just replays the moves from one game to the other, and convinces both of them that she’s a grandmaster in the process.
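The relay is simple enough to sketch in a few lines of Python. This is only an illustration, not a chess engine: the two “grandmasters” are stand-ins that replay scripted moves, and Alice’s entire strategy is to forward each player’s move to the other.

```python
# Sketch of the Chess Grandmaster man-in-the-middle relay.
# The "grandmasters" here are hypothetical stand-ins that emit
# scripted moves; the point is that Alice plays no chess herself.

from collections import deque

def make_grandmaster(scripted_moves):
    """Return a player that emits its next scripted move,
    receiving (and ignoring) the opponent's last move."""
    queue = deque(scripted_moves)
    def play(opponent_move):
        return queue.popleft() if queue else None
    return play

gm_white = make_grandmaster(["e4", "Nf3", "Bb5"])  # plays white vs. Alice
gm_black = make_grandmaster(["e5", "Nc6", "a6"])   # plays black vs. Alice

# Alice's relay: each white move is forwarded to the black-side
# grandmaster, and the reply is forwarded back.
transcript = []
white_move = gm_white(None)
while white_move:
    black_move = gm_black(white_move)   # relay white's move
    transcript.append((white_move, black_move))
    white_move = gm_white(black_move)   # relay black's reply

print(transcript)
```

Both grandmasters are effectively playing each other, while each believes he is playing Alice; she contributes no chess ability at all.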
Detecting these sorts of man-in-the-middle attacks is difficult, and involves things like synchronous clocks, complex cryptographic protocols, or—more practically—proctors. Proctors, of course, can be fooled. Here’s a real-world attempt at this type of attack on the MCAT medical-school admissions test.
Police allege he used a pinhole camera and wireless technology to transmit images of the questions on a computer screen back to his co-conspirator, Ruben, at the University of British Columbia.
Investigators believe Ruben then tricked three other students, who thought they were taking a multiple choice test for a job to be an MCAT tutor, into answering the questions.
The answers were then transmitted back by phone to Rezazadeh-Azar, as he continued on with the test in Victoria, police allege.
And as long as we’re on the topic, we can think about all the ways to hack this system of remote exam proctoring via webcam.
One of the things I am writing about in my new book is how security equilibriums change. They often change because of technology, but they sometimes change because of incentives.
An interesting example of this is the recent scandal in the Washington, DC, public school system over teachers changing their students’ test answers.
In the U.S., under the No Child Left Behind Act, students have to pass certain tests; otherwise, schools are penalized. In the District of Columbia, things went further. Michelle Rhee, chancellor of the public school system from 2007 to 2010, offered teachers $8,000 bonuses—and threatened them with termination—for improving test scores. Scores did increase significantly during the period, and the schools were held up as examples of how incentives affect teaching behavior.
It turns out that a lot of those score increases were faked. In addition to teaching students, teachers cheated on their students’ tests by changing wrong answers to correct ones. That’s how the cheating was discovered; researchers looked at the actual test papers and found more erasures than usual, and many more erasures from wrong answers to correct ones than could be explained by anything other than deliberate manipulation.
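The detection logic amounts to simple outlier analysis: flag classrooms whose wrong-to-right erasure counts sit far above the district norm. Here’s a minimal sketch of the idea; all the numbers and classroom names are made up for illustration, and the real analyses were considerably more careful.

```python
# Hypothetical sketch of the erasure analysis: flag classrooms
# whose wrong-to-right (WTR) erasure counts are statistical
# outliers relative to the rest of the district.

from statistics import mean, stdev

wtr_per_classroom = {  # illustrative, invented counts
    "Room 101": 3, "Room 102": 2, "Room 103": 4, "Room 104": 3,
    "Room 105": 2, "Room 106": 3, "Room 107": 4, "Room 108": 2,
    "Room 109": 3, "Room 110": 19,  # far above the norm
}

counts = list(wtr_per_classroom.values())
mu = mean(counts)
sigma = stdev(counts)

# Flag anything more than 2 standard deviations above the mean.
flagged = [room for room, n in wtr_per_classroom.items()
           if (n - mu) / sigma > 2]
print(flagged)
```

An unusual erasure count isn’t proof of cheating on its own, but it tells investigators where to look.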
Teachers were always able to manipulate their students’ test answers, but before, there wasn’t much incentive to do so. With Rhee’s changes, there was a much greater incentive to cheat.
The point is that whatever security measures were in place to prevent teacher cheating before the financial incentives and threats of firing weren’t sufficient to prevent it afterwards. Because Rhee significantly increased the costs of cooperation (by threatening to fire teachers of poorly performing students) and increased the benefits of defection ($8,000), she created a security risk. She should have increased security measures to restore balance to those incentives.
This is not isolated to DC. It has happened elsewhere as well.
There are several services that do automatic plagiarism detection—basically, comparing phrases from the paper with general writings on the Internet and even caches of previously written papers—but detecting this kind of custom plagiarism work is much harder.
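The phrase-comparison approach those services use can be sketched with word n-grams (“shingles”): count what fraction of a submission’s n-grams appear verbatim in a source. This is a toy version—real services match against enormous corpora and do fuzzier matching—but it shows both why verbatim copying is easy to catch and why right-clicked synonyms evade it.

```python
# Minimal sketch of phrase-matching plagiarism detection using
# word n-gram ("shingle") overlap. Illustrative only; real
# detectors are far more sophisticated.

def shingles(text, n=4):
    """Return the set of n-word phrases in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(submission, source, n=4):
    """Fraction of the submission's n-grams found verbatim in the source."""
    sub = shingles(submission, n)
    return len(sub & shingles(source, n)) / len(sub) if sub else 0.0

source = "shoplifters slip the goods they steal under coats or into pocketbooks"
copied = "shoplifters slip the goods they steal under coats"
scrambled = "shoplifters tuck the items they take under jackets"

print(overlap(copied, source))     # high: verbatim copying
print(overlap(scrambled, source))  # low: synonym swaps defeat exact matching
```

The scrambled version scores zero even though it is plainly derived from the source—exactly the weakness the right-click-and-scramble plagiarist exploits.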
I can think of three ways to deal with this:
- Require all writing to be done in person, and proctored. Obviously this won’t work for larger pieces of writing like theses.
- Semantic analysis in an attempt to fingerprint writing styles. It’s by no means perfect, but it is possible to detect if a piece of writing looks nothing like a student’s normal writing style.
- In-person quizzes on the writing. If a professor sits down with the student and asks detailed questions about the writing, he can pretty quickly determine whether the student understands what he claims to have written.
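The stylometric idea in the second option can be sketched very simply: compare the relative frequencies of common function words—which writers use largely unconsciously—between a student’s known writing and a submitted paper. Real stylometry uses many more features; this sketch, with invented sample sentences, is only meant to show the shape of the technique.

```python
# Toy stylometric comparison: cosine similarity between
# function-word frequency profiles of two texts. The feature set
# and sample texts are illustrative assumptions, not a real system.

import math
from collections import Counter

FUNCTION_WORDS = ["the", "of", "and", "to", "in", "that", "is", "it"]

def profile(text):
    """Relative frequency of each function word in the text."""
    counts = Counter(text.lower().split())
    total = sum(counts.values()) or 1
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

known = "the results of the study are in the data and it is clear that the effect is real"
submitted = "the analysis of the survey is in the report and it is evident that the trend is real"

similarity = cosine(profile(known), profile(submitted))
print(similarity)
```

A low similarity between a student’s known writing and a submitted paper isn’t proof of ghostwriting, but—like the erasure counts above a district norm—it flags where to ask questions.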
The real issue is proof. Most colleges and universities are unwilling to pursue this without solid proof—the lawsuit risk is just too great—and in these cases the only real proof is self-incrimination.
Fundamentally, this is a problem of misplaced economic incentives. As long as the academic credential is worth more to a student than the knowledge gained in getting that credential, there will be an incentive to cheat.
Related note: anyone remember my personal experience with plagiarism from 2005?