
A digital eye on your text
I’ve reblogged Louise Harnby’s “10 Ways to Proofread Your Own Writing” from Chris the Story Reading Ape’s blog. Harnby’s post is full of free tools for catching slips in your final copy. I decided to try out one of them, “The Bookalyser,” on the completed ms. of my as-yet-unpublished Surfing the Bones, a 98,000-word mystery.
STB had gone through an extensive edit, not least because an online critique process had left it much richer emotionally but far too long. Even though I’m currently responding to a beta read by updating some of the technology driving the plot and making minor setting changes, I considered the draft a good example of my own editing process. So I was curious to see what an editing app could tell me. What did I miss?
I have an advantage because I’m a grammar nerd capable of catching non-standard verb forms and recognizing passive-voice constructions. I can also form plural possessives, an apparently challenging task.
So standard “grammar-checkers” don’t help me much; they usually just object to my deliberate sentence fragments or my decision to start a sentence with “But.” I wanted to see if The Bookalyser offered more.
Like many programs for writers, the BA has a free version and two levels of paid versions. I used the free one. The site says up front that the tool won’t help you with style and usage questions; Word, it says, can do that. Instead, this tool provides a numerical/statistical portrait of certain features of your ms.
As advertised, if you register with email and password, it will run through your full manuscript in seconds and produce a complete printout of its findings.
Rather than describe the “more than 70 different tests (and growing) across 17 report areas,” I’ll discuss what I found most useful.
I learned that
- I use the word “maybe” 166 times, which is 10 times more than usual for fiction. Worth a search to see if I can cut some of those. Still, 166 times in 98,000 words isn’t cause for panic, I am relieved to say.
- Less than 1% of my text consists of the dreaded “-ly” adverbs, and only two appeared more often than expected. The app did call “belly” an “-ly” adverb, but I guess that can be forgiven in such a complex app.
- “Filler words” like “actually,” “fairly,” “just,” and “really” made up 0.59% of my text, as compared to 0.65% for fiction in general. Still, worth doing a search to see whether these are needed.
- I used “said” as a dialogue tag 207 times and some other tag 41 times, with only 7 of these tags used more than once. I report proudly that I used a dialogue tag with an “-ly” adverb only 8 (!!!) times in my 98,000-word text.
- The app did look for “passive” constructions, which it defined broadly, with “is dead,” “was afraid,” and “be afraid” alongside true passive-voice forms like “was followed” or “been killed.” In other words, predicate adjectives counted in this category. Even so, the app said that only 2.5% of my sentences fell into its “passive” categories. Hooray.
- The app compared phrases that I had hyphenated with instances of the same phrase that I did not hyphenate. I’m pretty good on hyphens, but these comparisons are well worth a search.
- It also encouraged me to look at spelling inconsistencies like “check out” vs. “checkout” and “web site” vs. “website.” Quick checks should allow me to decide on a preferred form.
Suggestions for eliminating possible redundancies were less helpful. I looked at a number of these, and will review them all, but found that the shorter version often sounded less natural, especially in dialogue. These are judgment calls that often save a single word. Every word did count during my aggressive edit to cut 7,000 words, but the trade-off (hmmm, hyphen?) was problematic. Example: “He didn’t admit to a crime” vs. “He didn’t admit a crime.” I’ll stick with the former. That said, the program did catch “more perfect,” though that one was in dialogue.
Oh, and it said it didn’t find any “Clichéd similes/comparisons.” ♥♥♥
I didn’t find useful information under “Commonly confused words and phrases,” but many writers will probably appreciate this section. The app also captures proper names and variations in capitalization. It listed word counts of various kinds: the most frequently used words, the most frequently used word trios, and the words most frequently used to open sentences. In my first-person text, “I” opened 1,329 sentences, compared to “He” (645) and “The” (435). Probably not a problem, but maybe worth a look.
In short, this is a FREE, rapid-acting tool that does provide interesting insights into my writing habits, offering me the chance to save a copy editor some work one day—and to produce a better-edited text should I publish this book myself. I recommend.