Automated Essay Scoring Myths: Part 2

Automated Scoring Myth #2:  Jobs Will Be Lost

This is the second article in this series, so please click here if you missed Part 1. Note that I will frequently use the abbreviation AES to refer to Automated Scoring / AI Scoring / Automated Essay Scoring. Let's begin...

After making the case in Part 1 for the accuracy of Automated Essay Scoring (AES) systems, it may seem that the natural consequence of AES would be the loss of jobs. Teachers and graders (called "readers") in particular may feel vulnerable, as evidenced by the petition a group of readers created in 2013. Each of these roles is worth examining separately, since the two are quite different in purpose and activities.

Teachers
In Part 1, we discussed the role of AES in high-stakes testing, but AES is also an important trend in the classroom. When I tell people about my work on AES, the response I sometimes get is to the effect of, "So, you are creating technology to replace teachers." This couldn't be further from the truth.

If 10% of a teacher's time is spent grading essays, would that let us get by with 10% fewer teachers? It's not illogical to jump to that conclusion, but the math doesn't add up when the goal is teaching students to become excellent writers. The reality is that writing is a craft that takes practice and feedback, just like any other skill. Yet the time required to grade papers forces writing instructors to assign fewer essays, with less feedback, than is optimal.

Enter AES -- a valuable tool that empowers teachers to give more writing assignments and allows students to receive more feedback. AES does not replace the teacher; it's simply another tool the teacher can use. In fact, it may be the best tool! Some teachers have told us that they require their students to run each paper through PaperRater before the teacher even sets eyes on it. PaperRater takes care of checking grammar, spelling, word choice, and more, which frees the teacher to help each student write with clarity and develop a distinct voice.

Readers (a.k.a. Graders)
The question of jobs for readers employed by testing institutions is murkier. It is true that when a computer scores a response, that is one response that will not be scored by a human reader. But that is not the whole story.

AES systems must be "trained" for each prompt they are expected to grade, and this process requires human readers to score a substantial set of responses (perhaps 600-2,000). The computer then uses this training set to build a model that it can use to score future responses. Human readers are therefore inextricably tied to the AES technology for each and every prompt.

Because of the expense associated with human readers, writing assignments have been excluded from most of the standardized tests students take each year. But, thanks to AES, this may be changing. Large groups of school systems in the U.S. and abroad are evaluating AES technology and vendors with the intention of incorporating written assessments (short answer and essay) into standardized testing across a wide range of subjects, from Biology to English Composition. If successful, this will create enormous demand for the scoring of written responses by both humans and computers. Essentially, AES would be "growing the pie" rather than just taking pieces of it away from human readers. So it's my belief that AES will result in more jobs for human readers, not fewer, though I concede that the future is much less clear in this area.
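To make the train-then-score cycle above concrete, here is a deliberately tiny sketch in Python. Everything in it is invented for illustration: real AES systems rely on hundreds of linguistic features and the 600-2,000 human-scored responses mentioned above, not a single word-count feature fit by simple linear regression.

```python
# Toy illustration of the AES train-then-score cycle described above.
# The feature (word count), the training data, and the scoring scale
# are all made up for this example.

def word_count(text):
    return len(text.split())

def train(scored_responses):
    """Fit score ~ slope * word_count + intercept by ordinary least squares.

    scored_responses: list of (response_text, human_score) pairs,
    standing in for the human-scored training set.
    """
    xs = [word_count(text) for text, _ in scored_responses]
    ys = [human_score for _, human_score in scored_responses]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

def score(model, text):
    """Score a new, unseen response with the trained model."""
    slope, intercept = model
    return slope * word_count(text) + intercept

# Four "human-scored" responses train the model; it then scores a new one.
training_set = [("word " * 5, 1), ("word " * 10, 2),
                ("word " * 15, 3), ("word " * 20, 4)]
model = train(training_set)
print(round(score(model, "word " * 12), 2))  # scores a 12-word response
```

The point of the sketch is the workflow, not the model: human scores go in once per prompt, a model is fit, and only then can the computer score responses it has never seen.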

About PaperRater

As in Part 1, I am including a shameless plug for our free Automated Essay Scoring tool. Students and teachers appreciate the immediate feedback they receive from PaperRater. You will not find another free tool that offers so many features, including grammar checking, spell checking, word-choice analysis, automated scoring, and plagiarism detection. We hope you will give it a try!