Grammarly: A not-bad grammar checking tool (30/8/2013)


So what follows here is slightly unusual fare for this blog, but it’s writing-related and turned out to be rather more of an interesting experiment than I initially thought. So . . . a few months back I was invited to play with a grammar checking tool and, for some reason I still don’t understand, imagined this would come complete with an extra day tucked into the week somewhere in which to play with it. Still, it was an interesting exercise in the end.

The tool is called Grammarly and, when I used it, it operated as a web application. My previous experience with grammar checkers is limited to the grammar checker in MS Word, which I loathe with abundant passion. As a writer of fiction, I come to any grammar checker with a deep sense of suspicion. You see, it’s not my job or my aim to write grammatically correct prose; it’s my aim to write prose that flows, and this frequently means deliberately breaking grammar “rules”, even in descriptive passages. As for dialogue . . . Well, people don’t talk grammatically. They just don’t.

The upshot is that I have two main criteria for assessing the utility of any tool like this. The first is the criterion I expect the designers aim to fulfil: how well does the tool identify and explain grammatical errors? The second is one I don’t see how any designer could possibly address: how much of my time does the tool waste in correctly pointing out grammatical errors that were intentional in the first place and that I therefore don’t want to change? This second criterion is one at which I expect every grammar tool ever made, now or in the future, to fail, simply because the number of deliberate “mistakes” in a work of fiction will be so high that reviewing them all becomes boundlessly annoying.

First things first – a few generic irritations to get off my chest. I can understand why there’s a limit to the size of the document that can be uploaded to a web application, and I suppose that for most purposes the limit (I can’t remember exactly what it is, but I think 10k words) is fairly generous. Still, I could see it getting quite irritating loading up and editing an entire novel in chunks – partly because it’s just another irksome chore, but mostly because I suspect it undermines the potentially rather useful “ignore all” feature (of which more in a moment). It’s also a bit irritating having the application do spell-checking when I’ve already done that and now have to go through clicking “ignore all” lots of times, presumably only to do the same again when I load up the next chunk with exactly the same set of character names and places.

Grammarly splits the faults it finds into a plethora of sub-categories and has an “ignore all” option for each one individually. It wasn’t clear to me exactly how this works – I initially took it to mean that all grammar faults of a particular type would be ignored throughout the text (which would have been useful), but this didn’t seem to be the case. It became clear when I used the tool later that I’d like to switch various parts of the grammar checking in and out, tailoring the tool to my personal strengths and weaknesses. I thought the ignore-all options would allow this, but they didn’t seem to work that way.

Something I wasn’t able to test, but which might mitigate or even completely eliminate these two irritations, is the tool’s integration with MS Office. Grammarly offers the option to download the tool as a plug-in (I think). Presumably this would allow entire novel-length documents to be examined in one go while integrating seamlessly with the Office dictionaries. Presumably. Unfortunately, as I don’t use MS Office, I wasn’t able to test this. Having to cut out chapters, work on them in a separate tool and then paste them back makes the tool a non-starter for me, and that’s a real pity, because I found the reports the tool made on my two sample pieces impressive.

On to the detail then: For the review I used two test pieces of prose. Sample one was two chapters (3600 words) of The Crimson Shield. This was text that had been (allegedly) written and rewritten to perfection by me, then edited, rewritten again, copy-edited and proof-read, so it really ought to have been squeaky-clean. The second sample was a single chapter of 6160 words from a work in progress that I think is about ready for submission to my editor.

For the proof-read sample, the tool identified a good few categories of fault. In each case, I’ve noted the type of fault, the number of instances found and the number I felt merited a change to the prose, in the form [found/changed]:

  • Use of Articles (a test for the presence of an unnecessary definite article) [2/0]

  • Pronoun Agreement (a test to see whether a pronoun has the correct form (i.e. singular/plural and subject/object) for the noun it replaces) [6/1]

  • Use of Adjectives and Adverbs [4/1]

  • Incomplete comparisons [3/0]

  • Use of “Like” and “as” [1/0]

  • Faulty Parallelism (i.e. in a sentence with multiple clauses, the verbs either side of the co-ordinating conjunction should have the same tenses. I had a debate with an editor about this a couple of months back) [3/2]

  • Squinting modifiers (when a modifier in a sentence with multiple clauses is not unambiguously associated with a specific one of the clauses) [6/1]

  • Mistakes using qualifiers and quantifiers [0/0]

  • Split infinitives [1/1]

  • Subject and Verb Agreement [8/0]

  • Apparent missing verbs [4/0]

  • Verb Form Use (wrong form of a verb) [2/0]

  • Possible missing words [0/0]

  • Punctuation: commas – this particular MS went through a great comma cull, and the tool and I disagree on the appropriate use of commas for run-on sentences and before a conjunction joining independent clauses. I’m far from sure I’m right on this one. The tool made 35 suggestions, of which 32 concerned commas. I would have implemented nine of them, and most of the others I think my editor would have implemented. We don’t see eye to eye on commas. The tool apparently doesn’t understand ellipses . . .

  • Spelling: The tool found 135 spelling mistakes, all of which were character names, place names and the like. The tool has its own dictionary and it only took about 30 seconds to go through and add them all.

  • Commonly Confused Words [2/0]

  • Capitalisation [0/0]

  • Vague and over-used words [0/0]

It’s worth noting that a good few of the apparent problems (perhaps 30%) occurred in dialogue where the tool was clearly correct in identifying a grammatical fault but the fault lay within the pattern of speech for a particular character and thus didn’t merit change. In two flagged sentences, although I disagreed with the change proposed by the tool, I would have made a related change.

Overall, for this “polished” piece of prose, it took me about twenty-five minutes to upload, run the tool and review the results. Ignoring spelling and punctuation, the tool flagged forty-one possible problems of which eight would have resulted in a change to the manuscript if it hadn’t already been too late. It flagged thirty-five punctuation problems of which I would have implemented nine changes. The spell-checking was superfluous. Expanded to an entire novel, this equates to about ten hours of work to catch some 200-250 sentences that could have been more clearly written, i.e. close to one per page (I’ll ignore the punctuation and spelling). This strikes me as quite a lot for a finished manuscript.

For the “submission-ready” sample, the results were slightly different. As a general note, I found that the sentences highlighted by the tool in this sample frequently merited some examination and rewording, even if the specific problem highlighted wasn’t one I felt required changing.

  • Use of Articles (a test for the presence of an unnecessary definite article) [5/1]

  • Pronoun Agreement (a test to see whether a pronoun has the correct form (i.e. singular/plural and subject/object) for the noun it replaces) [11/3]

  • Use of Adjectives and Adverbs [7/3]

  • Incomplete comparisons [7/2]

  • Use of “Like” and “as” [1/0]

  • Faulty Parallelism (i.e. in a sentence with multiple clauses, the verbs either side of the co-ordinating conjunction should have the same tenses. I had a debate with an editor about this a couple of months back) [2/1]

  • Squinting modifiers (when a modifier in a sentence with multiple clauses is not unambiguously associated with a specific one of the clauses) [1/0]

  • Mistakes using qualifiers and quantifiers [1/0]

  • Split infinitives [0/0]

  • Subject and Verb Agreement [1/0] (the tool mistook a proper noun for a plural)

  • Apparent missing verbs [7/2]

  • Verb Form Use (wrong form of a verb) [9/2]

  • Possible missing words [2/2]

  • Punctuation: In the unpolished sample, the tool raised 125 queries. The issues were much the same as above.

  • Spelling: correct English spellings were being flagged as incorrect, with no apparent way to change the language of the dictionary.

  • Commonly Confused Words [16/0]

  • Capitalisation [4/0]

  • Vague and over-used words [10/9]

For this piece it took about fifty minutes to go through the whole process. Ignoring spelling and punctuation again, the tool flagged seventy-five possible problems, of which twenty-three seemed to require a change to the MS and a further nine resulted in changes to the highlighted sentence due to related problems. Expanded to an entire novel, this equates to about fourteen hours of work to catch some 500-550 sentences that could have been more clearly written (I’ll again ignore the punctuation and spelling).
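The novel-scale figures in both samples are simple linear extrapolations from the sample word counts. As a sanity check, here’s a quick sketch of that arithmetic; the novel length is my own assumption (the post never states one), with ~95,000 words picked as a plausible figure for a fantasy manuscript:

```python
# Linear extrapolation of per-sample review effort up to a full novel.
# NOVEL_WORDS is an assumption -- the post never states a target length.
NOVEL_WORDS = 95_000

def extrapolate(sample_words, review_minutes, accepted_changes):
    """Scale review time (in hours) and accepted changes to novel length."""
    scale = NOVEL_WORDS / sample_words
    return review_minutes * scale / 60, accepted_changes * scale

# Polished sample: 3,600 words, ~25 minutes, 8 accepted changes.
hours1, changes1 = extrapolate(3600, 25, 8)

# Submission-ready sample: 6,160 words, ~50 minutes,
# 23 direct changes plus 9 related ones.
hours2, changes2 = extrapolate(6160, 50, 23 + 9)

print(f"polished:         ~{hours1:.0f} hours, ~{changes1:.0f} changes")
print(f"submission-ready: ~{hours2:.0f} hours, ~{changes2:.0f} changes")
```

Depending on the exact novel length assumed, this lands in the region of ten to twelve hours and roughly 200 changes for the polished sample, and thirteen to fourteen hours and around 500 changes for the submission-ready one – broadly where the figures quoted above come from.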

In both samples, I found the tool clear and easy to use, and its explanatory text on why it was proposing a change was lucid and sensible. On numerous occasions, I found sentences where the highlighted ‘fault’ wasn’t one I agreed with, but there was some clumsiness in the sentence construction that deserved to be addressed and had caused the fault to be flagged.

Crunch question – will I use it? As things stand, no, because having to chunk up work and feed it piecemeal into a web-based tool is really irritating and prone to introducing mistakes. If I used MS Office and the integration really is seamless, I might think otherwise: even with the support of a professional editorial team, the number of faults I would have changed in the supposedly polished sample was, I thought, high. Although the number of faults that didn’t merit any change was also high, the tool was clear and easy to use, the explanations given were lucid yet detailed, and it was almost always quick and easy to make a choice on a proposed change and move on. I’d probably wrap the use of the tool into the copy-editing stage of manuscript production. It’s probably also a useful tool for identifying, and perhaps rectifying, systematic flaws in a writer’s style. I could see a few patterns starting to emerge even from these two samples.

One last minor irritation: the tool speaks fairly well for itself in its full version, but although a non-subscriber can paste in some sample text and have it run a report, it doesn’t sell itself very well that way (it tells you there’s a pile of problems but doesn’t show you what they are, which feels a bit like a virus checker). It’s understandable that Grammarly don’t want people freeloading off their hard work, but I do wonder whether giving free access to the web-based tool with a maximum sample size of 500 or 1000 words would show the tool off much more effectively.

Disclosure: This review was presented to the suppliers of Grammarly for comment in case I’ve misrepresented their tool. They didn’t ask for any changes or clarifications.
