Our plagiarism detection currently does a pretty good job, but we recognize that users would like to see what page(s) on the Internet matched their document. This has been added to our "To Do" list.
Some of our critics have stated that our automated proofreading service is useful but doesn't wow them enough to keep coming back. Please consider that what you see right now is only the tip of the iceberg in terms of where we plan on taking this service. If grammar checking, spelling correction, word choice analysis, and plagiarism detection do not do it for you, then we feel confident that the features on the horizon will. Check back soon...
I read about your site today and tried it out. I'm sorry to say that I wasn't impressed. The item I submitted had an obvious dangling participle that was not flagged--and that was the test I'd hoped you would pass.
My criticism isn't meant to be negative but rather to alert you to a spot where improvement is needed. Unfortunately, I don't see any location for doing so other than commenting on a post...my intention is to be helpful. Maybe some will read this one day.
Your rater ignored "These" but questioned "tales," which should have been treated identically to the unquestioned "Tales" of the title.
Your rater said my sophisticated vocabulary was only 83% of "average," yet challenged terms such as "ensure," "the fact that," and even "conclude" [=logically decide] as "complex" [and suggested "end" or "finish" for the last one].
It also seemed to ignore quotation marks when novel terminology was used... legitimate to question, I suppose, but I wonder at suggestions so far out of context.
Another issue I encountered is the user-unfriendly way Firefox has started behaving with all web pages (I lost my "back/forward" tabs and do not know how to close Paper Rater).
A standard toolbar or edit bar right on the web page would work anytime.
Thanks for the feedback. I know this response is somewhat late, but it may be useful for people who find this article in the future...
@Michael - I do appreciate any constructive criticism. We are improving grammar detection capabilities right now during development and hope to release these improvements soon. However, in response to your comment, I'm not certain that dangling participles are the type of error that should be used to judge the effectiveness of our algorithms. Typically, evaluating whether a participle is dangling requires deep semantic understanding of the antecedent -- something which computers do not yet have.
@AEF's GURU - Thanks! We have made several changes since your post that will eliminate some of the false positives you experienced.
@oasic: I'm no longer a programmer, but over two decades ago, we worked on ways to diagram sentences by computer. The problem was not writing code that would analyze a sentence based upon strict grammatical rules; the problem was lack of memory and storage. Now that those problems are long gone, I would think today's clever programmers would have no difficulty. I'm not trying to be critical so much as encouraging you to be more creative.
Do schools still teach students how to diagram sentences?
@Michael - Please do not feel that you are being too critical; we appreciate all the feedback. We are currently working with some of the tools that you mentioned -- ones that parse a sentence into a tree-based structure. Because language is so dynamic and a word's grammatical role in a sentence often depends on its meaning, this approach is hardly foolproof. Computers can parse a sentence into multiple trees and choose the most probable. Often, no suitable tree (i.e., diagram) can be generated, which sometimes indicates a grammatical error in the sentence. However, identifying the specific error is more challenging. Finally, there's the issue of false positives. If we find 100% of the errors, but also include a large percentage of false positives, then people will not be happy. So, it's a balancing act and there is still much to be done.
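To make the idea above concrete, here is a minimal sketch of the kind of parsing being described -- a tiny CKY recognizer over a toy grammar I made up for illustration (this is not PaperRater's actual grammar or code). A sentence the grammar can derive produces at least one tree; a sentence it cannot derive produces no tree at all, which is one signal of a possible grammatical error:

```python
from itertools import product

# Toy grammar in Chomsky normal form (hypothetical, for illustration only):
# binary rules map a pair of child labels to a parent label.
BINARY = {
    ("NP", "VP"): "S",
    ("Det", "N"): "NP",
    ("V", "NP"): "VP",
}
# Lexical rules map words to their possible part-of-speech tags.
LEXICON = {
    "the": {"Det"},
    "dog": {"N"},
    "cat": {"N"},
    "sees": {"V"},
}

def parses(tokens):
    """CKY recognizer: True if the toy grammar derives the sentence."""
    n = len(tokens)
    # chart[i][j] holds the labels that can span tokens[i..j] inclusive.
    chart = [[set() for _ in range(n)] for _ in range(n)]
    for i, word in enumerate(tokens):
        chart[i][i] = set(LEXICON.get(word, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):  # try every split point
                for left, right in product(chart[i][k], chart[k + 1][j]):
                    parent = BINARY.get((left, right))
                    if parent:
                        chart[i][j].add(parent)
    return "S" in chart[0][n - 1]

print(parses("the dog sees the cat".split()))   # → True  (a tree exists)
print(parses("dog the sees cat the".split()))   # → False (no tree: flag it)
```

A real system would use a much larger, probabilistic grammar and score competing trees to pick the most likely one; the point here is only that "no tree can be built" is a detectable condition, while pinpointing *which* word caused the failure is the genuinely hard part.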