Before we get started, I want to provide the following LIVING LANGUAGE DISCLAIMER: Grammatical rules are not ordained by God. For most of human history, grammatical rules didn't exist. Look at Shakespeare and you’ll see all kinds of usages that would later be considered grammatically incorrect. And here’s the larger point: living languages (as opposed to dead ones like Latin or Neanderthal) change constantly. Each generation frowns upon the new words and phrases of the next generation, but the language keeps rolling on, ever-changing. Today’s slang or ungrammatical speech is tomorrow’s normal usage. Think of language like a webpage that constantly refreshes itself.
WHAT THIS DISCLAIMER MEANS FOR THIS BLOG: Several of the entries you’ll see here involve grammar, by which I mean the grammatical rules that dominated American English in the 20th century. Now, in the 21st century, those rules rule . . . not so much. Because I was educated in the grammar of the 20th century, I’m emotionally connected to it. But nothing on earth stays the same, including grammar. So: any grammatical rules you find here may be outmoded by the time you finish reading. A century from now, those rules will probably look like fossils of extinct forms of speech. In which case, they’ll be an interesting record of what American writers once considered “good English.”
A Writer Is Someone Who Writes
The question, “Should I be a writer?” can be answered with another question: Do I need to write? If your answer is “yes” -- if you need to write -- you should be a writer.
Just as a painter is someone who paints, and a musician is someone who makes music, a writer is someone who writes. It really is that simple.
The writer or painter or musician may not get paid to perform these skills, but the presence or absence of payment has nothing to do with the definition of an artist. Payment defines an artist’s commercial success, nothing more. Terms like “amateur” and “professional” are useless for illuminating what it means to be a writer. They’re technicalities. The day before Shakespeare made his first pound as a playwright, he was an amateur.
A writer is someone who writes. This may sound like a glib statement, but it isn’t. When I watch LeBron James play basketball, I’m aware that I’m watching a man who HAS to play basketball. Artists are no different. They do what they do because they must. They are driven to it.
Talent at writing includes the inclination to write. Talent is drive plus aptitude. Which doesn’t mean that drive equals talent any more than talent equals drive. You can be driven to write without thereby being Shakespeare. Or, you can have the imaginative and verbal gifts of Shakespeare without the drive to write. There is no such thing as a literary genius who never writes.
And I think the opposite is also true. There is no reason why a person with a drive to write cannot become a skilled writer. Gifts may be genetically given; skills are not. Skills can be acquired.
Does this displease you? Is it a let-down to hear the word “skilled” where you’d rather hear “gifted”? Is “skilled” an anti-climax to the drama of your becoming a writer? Is the work of becoming a writer not worth it if, in the end, you will be a skilled, but not an ingenious, revolutionary, or brilliant writer? Are you nodding your head? Are you saying, “Yes, it’s not enough for me -- if I’m not going to be acknowledged as a literary genius, a talent of gargantuan proportions, a monster of verbalization, then I don’t want to do all that work”? If so, you’re probably not a writer.
But if you are a writer, I’m not saying you won’t become an extraordinary one. You might, and if you do, this will be the reason: by learning the skills of a writer, you will have given yourself a framework in which your native ability – your talent – can emerge. . . .
The digital age has changed how we think about and use written language.
On the one hand, the internet makes the written word increasingly unimportant. The tendency of digital media is to replace text with audio-video, because it’s easier to listen to language than to read it, and it’s more fun to watch images than to stare at text. There, in a nutshell, is the history of mass communication in the last hundred years, when radio surpassed newspapers and TV surpassed radio. The written word gave way steadily to electronic media. The internet simply extends this phenomenon, seemingly infinitely, into the future.
On the other hand, the internet has vastly expanded the amount of writing in the world. It has inundated us and saturated us with text. As a result of email and blogs, people who would never have written began writing. By reviving the practice (though not the art) of written correspondence, email singlehandedly undid what the telephone had done to letter-writing years ago. Suddenly, everyone was writing to everyone else, including people who would have been too restless to compose and mail a letter. Email was so quick and easy, it created a whole new demographic of casual writers. And the new correspondence took every conceivable form: personal exchanges between friends, business exchanges between colleagues, political exchanges between voters.
Blogs took the write-it-down trend even further. Anyone capable of poking a computer keyboard could become, in effect, a published writer. Anyone could bypass the old “quality-control” screening procedures: submitting one’s writing to a publisher, where an editorial board decided on its quality and relevance, then working with an individual editor to improve the text prior to publication.
I hear myself about to hurl a platitude – “there’s good and bad in everything” – (yawn, yawn, snooze) but . . . in this there is. The Good: blogs opened up a writing-arena to people who had been forced to sit on the sidelines and watch a published elite play the game. The Bad: blogs are the unedited thoughts of bloggers. Which automatically differentiates them from Writers. Anyone who deserves to be called a writer understands that their unedited thoughts are the last thing they want to record for posterity. They realize that the finished product they’d like others to read has been sifted and distilled, checked and rechecked; it has been edited.
Wikipedia is a perfect illustration of what the digital age has done to writing. On the upside, the site can be informative about current events (though unreliable for history and other fields requiring systematic research) because it allows anyone to upload relevant information almost as quickly as an event unfolds. On the downside, because the site is so heavily traveled, Wikipedia does real damage to the written word. At best, the writing is acceptable. (Though it’s often dull, because the anyone-can-edit method leads to the Least Common Denominator of style). At worst, it’s bad, not necessarily because it’s grammatically incorrect, but because it fails the fundamental test of good writing: it’s unclear.
This unclarity comes from: 1) the fact that multiple participants stack entries on top of, alongside, and in between each other; 2) the fact that few participants are good writers. Back in the olden times of The Encyclopedia, a book like the Britannica could usually guarantee that its text had been written by a competent, sometimes talented, writer and edited by a skilled editor. This can’t be true of Wikipedia because of its method for inputting data. Oscar Wilde could come along and enter an elegant passage on Drama, only to be followed by an eager Wikipedian who wants to correct and improve Wilde’s text. Then comes another editor and another until poor Oscar has been reduced to the literary equivalent of generic marketing. I’m visualizing a big yellow box with fat black letters: DRAMA KNOWLEDGE.
To sum up, the digital age has besieged good writing on two fronts. 1) It moves us steadily toward video sites, which deliver content in a more sensory-stimulating way than a text-based site can. 2) It floods us with hasty, poor writing.
As a result, now more than ever, writing is something special. Those of us who aspire not merely to write, but to write well, form a distinctive community. We’re like the clan of dissenters in Ray Bradbury’s FAHRENHEIT 451, who have memorized all the great books banned by their totalitarian society and gather in secret to recite them to the younger generation. But unlike those characters, writers today face a different nemesis, not thought-control but indifference. When T.S. Eliot wrote, “This is the way the world ends, not with a bang but a whimper,” he could have been describing the apathy expressed by that great millennial mantra: “Whatever.”
Writers are the people who care about the future of the word. Writers are the ones who believe that precision in language is as important to our mental domain as precision in carpentry is to our home. . . .
Art From A Bottle
There is a tenacious myth about great writers: they operate in an altered state of consciousness. Their greatness -- the lyrical flourishes, the verbal dexterity, the imaginative machinations – must come from periods of intensified, uninhibited sensation about both the world without and the world within. In other words, they must be high. Probably on alcohol. Stories circulate about books written in fantastic furies of drunken inspiration. So-and-So wrote his masterpiece, The Plasma Chronicles, in a single 156-hour sitting with nothing but a case of bourbon and two cartons of cigarettes astride his laptop. One thousand, one hundred and three pages later, his Mac was a smoking ruin but his manuscript was complete and his publisher orgasmic.
In a few instances (very few), something like this does happen to a writer. John Guare described having written SIX DEGREES OF SEPARATION in three days, but he also said that it took him fifty-one years to arrive at those three days. And he wasn’t drunk when he got there.
More often than not, the person who tells you they’ve just finished a W.M.U.I. -- whirlwind manuscript under intoxication – has less in common with Gustave Flaubert than with the writer portrayed by Jack Nicholson in THE SHINING. There’s a lot of junky writing out there written with tremendously appropriate haste.
Yes, it can happen that a writer, like an athlete, gets “in the zone,” that mental space where you can’t miss. When they do, they turn out pages of inspired text with the same amazing consistency as the basketball player who makes ten three-point shots in a row, or the pool player who runs the table, then runs it again as their opponent looks on in grim resignation. There is something daunting about this phenomenon of performance, so depressing when achieved by our competitors, so sweet when demonstrated by us. I think there is an element of “zoneness” to any period of time when we’re writing well, whether it lasts fifteen minutes or five hours. Being “in the zone” means, at bottom, being extremely focused on a goal, so focused that your mind operates without distraction. It’s the kind of concentration that leaves you feeling transported – if someone interrupts you, it takes a moment to remember where you are – and in that sense it is akin to a mystical experience.
Which is probably why the myth of the drunken writer persists. We know that great artistic performances have a magical quality to them. We don’t know when or why or how they arrive, those outbursts of exquisite creativity. They seem to just . . . happen!
We sense that we cannot force them into existence. Hence the analogy to intoxication. When we drink alcohol, something magical happens (at least for a while, before the malaise, the brawl and the stupor). One moment we’re full of worry and anxiety about our social performance, then – presto! -- suddenly we’re feeling fantastically free and easy, enjoying those around us and enjoying even ourselves. Rapport flows like wine. We have what psychologist Abraham Maslow famously described as a “peak experience,” a feeling of oneness with the world and a heightened sense of our own bright, unbounded potential.
A few years ago, under the influence of the intoxicated-artist myth, I conducted my own little literary experiment. I drank six consecutive shots of Southern Comfort, sat down at my computer, stared out the window looking over the beautifully urban landscape of Hell’s Kitchen, and began writing. It was about 10:45 A.M.
I decided to write a poem, because I suspected I had special poetic gifts that had lain dormant from childhood to middle age, waiting to be awakened . . . now! Unburdened by the liquor, I wrote without inhibition and without worry. By noon I was done! (The excessive exclamation points are intentional.) The remainder of my stoned time I spent imagining what kind of feedback Billy Collins would give my little gem. I didn’t know Billy personally, but I knew he taught at City College, or some nearby school, and could be easily reached by subway.
I’ve never again appreciated the virtues of the “Delete” button as much as I did that afternoon at about 2:30, when my mini-masterpiece, “Untitled,” joined the other digital flotsam-and-jetsam in my Trash folder, bound for that mysterious zone in cyberspace where non-analog junk ends up. It wasn’t that whatever I’d written was so awful. I don’t even think it was much more awful than what I would have written sober. But the lesson was clear: altering my state of consciousness did not alter my ability to write mediocre poetry.
Every great novelist I’ve heard interviewed on the subject has said that alcohol makes it impossible to write well.
Songwriter Elliott Smith has left us one of the saddest, most eloquent testimonials on this subject. In his great song “Between the Bars,” Smith, who committed suicide in his early thirties, wrote, “Drink up baby, stay up all night/With the things you could do. You won’t but you might./The potential you’ll be, that you’ll never see./ The promises you’ll only make.”
The reason intoxication and great writing don’t mix seems pretty obvious, if you think about it. To do anything well requires concentration. And what is alcohol famous for? I know -- I can’t remember either. But I think it started with the letter “G,” or maybe with the letter “Egma.” . . .
We like to call a person “eccentric” if they display one or more dramatically unique behaviors.
Our “eccentric” aunt lives alone in a big house packed so full of old newspapers, you need a compass to navigate the maze of paper-encased corridors leading from room to room. Our “eccentric” friend will never wear a bathing suit to the beach; he sits in the sun fully clothed, keeping even his shoes and socks on. We laugh, either gently or condescendingly, at the eccentrics around us, not realizing that each of us, if examined closely, displays some kind of (often private) behavior that is dramatically different from the behavior of many other people. And if there were a person who displayed no eccentric behavior, that would be eccentric . . . and sad. Sad, because it is our peculiar behavior, our distinctive ways of acting and thinking, that make each of us unique, individual, unreplicable.
Why do I take up your valuable time developing this point about eccentricity (a.k.a. individuality)? What does it have to do with writing?
Everything. As you attune your work habits to your own individual preferences, you have an opportunity to do something additional: to think more deeply about your individuality, about the specifics of you, the attributes that make your world and your thoughts different from other people’s. Why is this important? Because good writing is never generic; it’s always specific. The best writing conveys the writer’s individuality at its most specific. The best writing opens a door to one unique, individual mind. The more you’re able to open that door, the more you will produce interesting writing.
I wanted to take an extra page here to dwell on this misunderstood concept, eccentricity, because it will help you remove yourself from the myth of the intoxicated or eccentric artist. As long as you labor under this myth, it will mislead you. It will take you farther from your individuality, not bring you closer to who you are. It will cause you to adopt gestures and poses of eccentricity (including the gesture of getting stoned) in the deluded belief that those behaviors will have a magical effect on you as a writer, that they’ll somehow transport you into the zone where great literature flows like an endless spring.
The one kind of intoxication you ought to indulge is the intoxication of the writer, which is a dramatic passion for language. Great painters are intoxicated by sight. They obsess over color and light and shadow. They see these visual elements with extreme sensitivity and power. Great musicians are intoxicated by sound. They obsess over melodies and harmonies and dissonances. They hear these aural elements with phenomenal acuity. Great writers are intoxicated by language. They obsess over the meanings and sounds of words, and even over the sight of words, the way words look on a page (or a screen).
That should be more than enough intoxication for you. If you marry a passion for language to your own eccentricity (not somebody else’s), you will be a “real” writer.
Of course you may be someone who manages, like some famous writers, both to drink and write, keeping the two activities neatly separated, in which case, on that inevitable day when the separation collapses and the writing interferes with the drinking, you’ll have to give up the writing. But that goes without saying.
Let’s begin with something that may seem small but isn't: Word Placement.
Notice the title of this post: To Write or To Not Write . . . And notice the placement of the word "not" in "To Not Write."
“OK,” you may be saying, “so what?” Well, there's something not quite right about it.
Where we put that little word makes a difference in how our reader understands what we're saying.
If we say "to not write," we are splitting the infinitive. (The infinitive is "to write," and by inserting a word between "to" and "write" we are doing what grammarians call "splitting the infinitive.")
In the bad old days of grammar teachers and corporal punishment in the public schools, you weren’t allowed to split an infinitive. “To think deeply” was correct; “to deeply think” was incorrect. “To leave quickly” was right; “to quickly leave” was wrong.
The main reasons for this rule had to do with clarity and power. It’s clearer and stronger to keep the two parts of the infinitive together than to separate them with other words.
Shakespeare understood this when he wrote, “To be or not to be?” Compare that to: “To be or to not be?” The second version sounds much worse because it’s weaker. In Shakespeare’s day there was no grammatical rule prohibiting split infinitives, so he could have written “To be or to not be?” if he’d wanted to. Why didn’t he? Because the man had an ear.
Today it’s acceptable for a writer to split an infinitive. Sometimes a phrase or sentence sounds better, more natural, if you do. “I wanted to quickly leave the room” might work better in a particular passage than “I wanted to leave the room quickly.” In either case the meaning is completely clear (you’re leaving fast), so which choice the writer makes has to do only with what effect they want to create.
But the case of “not to write” is slightly different. Take the sentence "We want to learn to NOT write junk." By splitting the infinitive, we create a little confusion. Are we learning to NOT write? And the thing we’re NOT writing is junk?
Yes, you probably understand what I mean if I say it this way, but your brain registers the slight confusion caused by my sticking “not” in between “to . . . write.” The more slight confusions your mind encounters during a conversation or during a period of reading, the less engaged you become.
This may seem like a small thing but it points to a larger, extremely important issue for the writer: being acutely attentive to where you place words. Even the smallest words -- “to” or “not” -- can have a significant effect on your reader, and that effect will depend on where you place the words.
Which is why Shakespeare DIDN’T write: “To be or to not be?”
Rule One for the Good Writer:
Do not introduce confusion into your reader’s mind.
(Unless it’s deliberate confusion, as in a novel or a play; and even then, be careful not to go too long before resolving the confusion, or you’ll lose your audience.)
Writing Strategies, Writing Tips, Writing Advice . . .
A strange thing has been happening to pronouns in America. And you know the old saying: Pronouns goeth before the Fall. Or was it Pride?
Back in the 1990s or thereabouts, people, including TV anchors and talk-show hosts (whose English affects millions of viewers), began saying “It happened to he and I.” Before then people of all educational levels were happy to stick with the grammatically correct, “It happened to him and me.”
So what happened to HE and HIM, and SHE and HER, and I and ME?
First, a quick grammar review:
Use “I,” “She,” “He,” and “We” as subjects (the person doing an action).
Use “Me,” “Her,” “Him,” and “Us” as objects (the person receiving an action).
After prepositions, use “Me,” “Her” or “Him,” not “I,” “She” or “He.” Examples: to him; for her; by me; against her; before him; behind me.
I hit the ball. (I’m the subject, the doer of the action.)
The ball hit me. (I’m the object, the receiver of the action.)
She gave the gift. (She is the subject, the one doing the giving.)
The gift was given to her. (She is the object, the one receiving the action, the one being “gifted.”)
He spoke to Monique. (He is the subject, the one who did the speaking.)
Monique spoke to him. (He is the object, the one who was spoken to.)
We drive Sue and Monica to the beach. (We are the subject, the ones doing the driving.)
Sue and Monica drive us to the beach. (We are the object, the ones being driven.)
So far so good. The trouble doesn’t involve the plural pronouns “we” and “us.” Nobody has been saying, “They gave the gift to we.” And the trouble has not involved cases in which there is only one person serving as an object. Nobody is saying “They gave the gift to I,” or “The accident happened to she.”
But the same person who will not say, “They gave the gift to I,” WILL say, “They gave the gift to Jim and I.” Or, “They gave the gift to he and I.”
So: the trouble has involved sentences in which more than one person serves as an object of a verb or preposition.
The real question is: Is this really a problem?
Not necessarily. Why do I say this? If the usage is grammatically incorrect, then isn’t it a problem? Isn’t it wrong?
Well . . . yes and no.
“What?” you reply. “Can’t you give me a straight answer?”
Maybe. The straight answer is another question: does a change in language cause a problem in communication? If it does, then, yes, we have trouble. We have a problem. But if the change in language doesn’t confuse people, then it’s not really a problem even though it breaks a grammatical rule.
Personally, I hate it when people say, “It happened to he and I.” Why? Because I like the distinct uses of “I” versus “Me” and “He” versus “Him.” To my mind, these distinctions enrich our language.
But the final question about language is always about CLARITY. Is a statement clear, or is it ambiguous? If it’s ambiguous, it’s bad. If it’s clear, it’s OK.
There is nothing confusing about the statement, “It happened to he and I.” I know that something happened, and it happened to me and the male person represented by “he.” If it’s “Jim” we’re talking about, then I know something happened to Jim and me.
An example of a grammatical error that DOES produce confusion is: “I know it’s name.” The correct statement would be “I know its name.”
By adding an apostrophe to “its,” we change the meaning: “it’s” means “it is.” So that sentence is actually saying, “I know it is name.” This is confusing; it makes no sense. So it must be corrected.
As much as I’d like American English to return to the good old days when an “I” was a real “I” and a “Me” stood for something special, I have a feeling that my wish will be denied. Too many people now use the incorrect form, and since the mistake doesn’t cause a clarity problem, it will probably continue and eventually become standard usage.
(This is what happened to “hopefully,” which we all use in a way that is technically wrong: “Hopefully my books will arrive today.” Literally that sentence means: “My books, full of hope, will arrive today.” “Hopefully” has been misused for a very long time as shorthand for “I hope that.” As a result, today it’s virtually correct.)
I think it’s important for Writers not to walk around in a state of indignation about this kind of mistake, as dissonant as it may be to the sensitive ear. A good writer should understand that language is a living thing; it constantly changes. Yesterday’s “no-no’s” can be today’s rule.
For myself: there’s no way I’m going to start saying “They gave the gift to Monique and I” any more than I’m going to say, “They gave the gift to we.” The first sentence sounds as bad to me as the second. But I’m not going to spend hours digging my nails into my palms when I hear someone else say it.
Last point: why did this change in our language happen? I think it happened the same way language often changes: from the bottom up. I’m pretty sure that social anxiety caused people to use the incorrect forms of the personal pronouns. I think “me” and “him” and “her” sounded “low-class” to people who weren’t sure about the grammar, whereas “I,” “he,” and “she” sounded “proper” and “high-class.” So they began to make the substitutions.
In other words, it’s people trying to present themselves in the best possible light to other people. That’s not a horrible motive. True, there’s an element of pretentiousness there (maybe they’re going to start lecturing their relatives about using a fork when eating fried chicken in a bucket).
But that’s what social anxiety does to we.
This one’s going to be short and sweet. OK, short.
If you want to say something is simple, just say: “It’s simple.” Don’t say: “It’s simplistic.”
Why? Well, for one, it’s wrong. For two, it’s pretentious.
“Simplistic” describes an over-simplified explanation. If someone tells you that oil is the cause of all wars in the Middle East, they're being simplistic because there are various causes of warfare in the Middle East. If you tell me that your marriage dissolved because your spouse was a jerk, you’re being simplistic. Your spouse may be a jerk, but that’s rarely the sole cause of a divorce.
Simplistic does NOT mean “simple.”
SIMPLE refers to something that is uncomplicated. If the instructions for installing a light bulb are quick and easy to follow, they’re SIMPLE, not simplistic.
Why did many people start substituting SIMPLISTIC for SIMPLE? Because SIMPLISTIC sounds big and important, whereas SIMPLE sounds, well, simple.
So the next time you want to sound big and important, say “transmogrification” instead of “change,” but leave SIMPLE alone.
In the first of these blogs, I talked about word placement, a subject you can’t talk about too much. Why? Because every sentence you write can be written in different ways (lots of different ways if the sentence contains more than five or six words).
Most people pay no attention to word placement. They spill out the words of a sentence like marbles from a bag, letting them land wherever. Even many writers don’t pay as much attention as they should to the question: Where will my words do their work most effectively?
Example. The other day I was writing an email that contained the sentence, “I visited the Dallas Theatre Center yesterday, which Frank Lloyd Wright designed in the late 1950s.”
Before sending, I changed the sentence to: “Yesterday I visited the Dallas Theatre Center, which Frank Lloyd Wright designed in the late 1950s.”
What’s the difference? Why is the revised version better than the original?
It’s better because it’s clearer. Why is it clearer? Because the word “which” comes directly after the thing it refers to. What did Frank Lloyd Wright design? The Dallas Theatre Center. He didn’t design “yesterday.” This point becomes more obvious if we put additional words between “Dallas Theatre Center” and “which.” Watch: “I visited the Dallas Theatre Center yesterday just before dinnertime, which Frank Lloyd Wright designed in the late 1950s.”
When we encounter “which” or “who” or “when” in the middle of a sentence, we automatically look to the word that immediately precedes it in order to know what it refers to. So, here’s a good rule of thumb: keep “which” or “who” or “when” as close as possible to its referent (the thing it refers to).
GOOD: I brought the car to the shop on Friday, when the best mechanic was on duty.
NOT SO GOOD: On Friday I brought the car to the shop, when the best mechanic was on duty.
GOOD: Tomorrow evening I’ll probably see Ray, who gave us that great anniversary gift.
BAD: I’ll probably see Ray tomorrow evening, who gave us that great anniversary gift.
I think I’ll finish this blog before I get too tired and fall asleep staring at my computer, which I’ve enjoyed writing.
Until thirty-something years ago, the grammatical rule was: never end a sentence with a preposition.
So, in conversation you might say to someone, “He’s the man Ambrosia voted for,” but in formal writing it had to be: “He’s the man for whom Ambrosia voted.”
“This is the book Alphonse searched for!” had to be: “This is the book for which Alphonse searched!”
The wittiest comment on this rule came (allegedly) from Winston Churchill, who said (more or less; no one knows for sure): “That’s the kind of nonsense up with which I will not put.”
Today nobody stays up at night worrying about the placement of their prepositions. But even though this rule, if followed to an extreme, will produce extremely stilted sentences, it’s still worth knowing. Why? Because it can add power to your writing.
1. He’s the man Ambrosia voted for.
2. He’s the man for whom Ambrosia voted.
Obviously #1 sounds more familiar, more relaxed, more . . . colloquial. Why? Because it is. It’s how we speak. And if you want a particular sentence to sound more familiar, relaxed and colloquial, then you should use this syntax (syntax being the technical term for “word placement” or “sentence structure”).
However: Listen closely to the sound of each sentence, and you’ll hear the strength of #2, which tucks the preposition neatly into its middle.
When “for” comes at the end, it creates a “trailing off” effect, a “. . .”ness. Your ear registers the fact that more words could easily be added: “He’s the man Ambrosia voted for on Tuesday after dinner.” In contrast to this, sentence #2 suggests finality, because it ends with the completed action: “voted.” Period. Done.
Even if the verb were in the present tense -- “for whom Ambrosia votes” -- the sentence would still possess greater decisiveness, because a verb is by nature much, much stronger (more concrete, more vivid, more DEFINITE) than a preposition.
Sentence #2 creates another interesting effect: it places the preposition (“for”) next to the object of the action (“the man”). What does this accomplish? It emphasizes the object. “... the man FOR WHOM ...” The “for whom” reinforces, in our ear and in our mind, the word “man.” Whereas in sentence #1 -- “...the man Ambrosia voted...” -- the word “man” is only briefly touched on the way to Ambrosia and the action she took.
So, sentence #2 gives greater emphasis to both the action and the object of the action, leaving us with a sentence in which the three main parts -- the subject, AMBROSIA; the object, THE MAN; the verb, VOTED -- all stand strong.
A preposition is a relationship word, a connective word. By-for-with-at-on-in-against-of-to-around-near-alongside, etc.
A verb is an action word. So the question is: do you want to end your thought (i.e. your sentence) with an action word, or not? Do you want the action, and the object of that action, to stand out more strongly in your reader’s mind?
Yes, doing so will make your sentence sound more formal. But why is that bad?
Not only is it NOT bad, it’s often great. Why? Because in our society writing has become increasingly informal. As a result, a more formal expression stands out, draws our attention, wakes us from the slumber of all that relaxed syntax.
And that’s a conclusion away from which I will not back.
Let’s face it: apostrophes are annoying.
Apostrophes are the flies of writing, necessary to clean up compositional garbage but annoying because they always end up buzzing around our heads.
Despite the fact that I fully understand the use of apostrophes, I often find myself typing “your” when I mean “you’re” and then having to edit:
“Your going to love the movie!” Backspace, insert little mark after “u,” add “e” to end. Say to self: either type slower (“more slowly,” if I’m talking to myself in a grammatically correct way) or don’t forget that you’re saying You Are going to love ...
Here’s the simplest (not the most simplistic -- see earlier post on that) way to get your apostrophes right: remember that the irritating little mark is what allows you to contract two words into one (didn’t = did not; wasn’t = was not). It also allows you to indicate a missing letter or letters, especially in colloquial writing (’bout time you called! Let’s rock ’n’ roll).
If not for the apostrophe, you couldn’t say couldn’t, and you couldn’t say “it’s” or “they’re” or “she’s” or “we’re.”
But if you omit them, you end up with sentences like: “Were going but Arnolds staying home.” You were going? Are you still going? And how many Arnolds are staying home?
Before you go, don’t forget the second use of Mr. Apostrophe: to indicate possession.
Is the book in the possession of Diana? Then it is Diana’s book.
Is the car in the possession of the repo man? Then it is the repo man’s car, until you pay up and it once again becomes your (not you’re) car.
DANGER! Do not use apostrophes to pluralize a word.
more than one bird = birds (not “bird’s”)
more than one enema = enemas (not “enema’s”)
Final deep thought on apostrophes:
Can’t live with ’em, can’t live without ’em.
VERY is a word we need, but not as much as you may think.
What can you say about a meal that’s better than good but not quite excellent? You can say it’s VERY good.
And how about that time you sprained your ankle while running? “Painful” didn’t really express how you felt. You needed a VERY.
But if you correctly describe the weather as rainy, and I reply, “That’s very true” -- then your language buzzer should be buzzing. A statement can be true or false, but it makes no sense to say it’s VERY true or VERY false.
Sure, in colloquial speech, there’s a reason I might reply, “It’s very true.” It’s a way for me to reaffirm what you’ve told me, a way to agree heartily with you, like saying “It sure is!” or “You’re really right about that!” In that case the language has a social function; it’s a bonding device.
But for prose writing, there are correct and incorrect uses of “very.”
Here are several examples of incorrect VERYs:
Sheila was very indignant about that.
Herbie is very miserable today.
He is very infatuated with her.
Why are these wrong?
Take a moment to think about the adjectives that are being modified here: indignant, miserable, infatuated. These words convey a state of mind that is already extreme. Indignation is a state of being VERY offended; misery is a state of being VERY unhappy; infatuated is a state of being VERY attracted. So, if you modify these words with “very,” you’re diminishing their inherent force; you’re implying that there are degrees of “indignant” and “miserable” and “infatuated.” But the entire point of words like these is to convey a feeling of extremity, of “there’s nothing greater than this state.”
Is Hyman Collosal’s latest novel superb, or very superb?
Is Felicity Bumpercrop’s voice angelic, or very angelic?
Is Kenneth Kipplemaster’s personality unique, or very unique?
I think the appearance of so many misplaced VERYs today comes from our weak understanding of the words in our vocabulary. If we don’t realize that SUPERB and ANGELIC and UNIQUE refer to extreme states or to qualities, like Truth and Falsehood, that can’t be increased, then we don’t really understand those words.
Am I right? Or am I very right?
The other day an article in the paper quoted someone on Wall Street who said, “We’re literally seeing millions of dollars coming and going” out of a suspicious investment account. If this were true, then we would be viewing long lines of high-denomination bills as they marched in and out of a bank.
What the man meant to say was, “We’re seeing literally millions of dollars coming and going.” If he had said this, we would know that the monetary figure of “millions” is not an exaggeration. He’s adding “literally” in order to tell us: “Hey, I don’t mean merely ‘a lot’ of money, as in ten thousand or a hundred thousand or even half a million -- I really am referring to millions of dollars.”
Rule One about the use of “literally” -- be careful where you place it. Put it before the word you intend to modify.
And Rule Two? Remember what “literally” means.
It means “exactly as stated.”
Are you literally dropping dead of fatigue?
If you say, “I’m dropping dead of fatigue,” there’s no problem, because your reader understands that you’re speaking figuratively, not literally. You’re using a figure of speech -- “dropping dead” -- which translates, in this case, into “I’m beat” (which is also a figure of speech, not a literal statement) or “I’m exhausted.”
But if you say, “I’m literally dropping dead of fatigue,” then you’d better be a couple of breaths from your last.
If things are going really well for you, they’re “coming up roses,” but don’t say “things are literally coming up roses.”
You want to describe someone on the brink of potential disaster, and you write: “Jamie was literally praying for her life.” If I read this, I’m going to assume Jamie actually prayed. I’m imagining her on her knees with hands clasped, or muttering supplicatory words under her breath, or something like that. So if you don’t really intend that meaning, don’t make me go to all this trouble. Or I’ll be literally beside myself with irritation, and the last time that happened, I found I had nothing to say to that guy standing next to me pretending to be me.
In this and the next post, I’m going to talk about the single most important element of strong writing: detail.
This rule applies to all prose writers. Whether you’re writing fiction or nonfiction, the better your details, the more lively and effective your writing will be.
The two most common types of detail are personal and situational. Personal detail means just that: the details that identify a person. Situational detail refers to descriptions of places, environments and interactions.
In today’s blog I’ll confine myself to the topic of personal detail. The next one will address situational detail.
A quick, easy way to start thinking about personal detail is to consider yourself. You’re the person you know best (whether you know it or not).
Most of us don’t realize that each of us is unique. We tend to lump ourselves in with other kinds of people whom we resemble (or think we resemble) in behavior and outlook. “I’m an extrovert” or “I’m a Democrat” or “I’m a doer” or “I’m a meditative type” or “I’m a family person” or “I go my own way,” etc.
While any of these labels may apply, they don’t tell much about who you are. What does? Every habitual action you perform, especially those that may seem so small or unimportant that you’re not aware of them.
How do you discover your personal details? By answering such questions as:
How many physical mannerisms do you have? (Do you bite your nails, pick your fingers, scratch your face, rub your skin, play with your hair, pick your nose, twiddle your fingers, wiggle your toes, hold your torso erect, slump your torso forward, walk fast or slow or with your toes inward or outward, squint your eyes, adjust your glasses, blink a lot, hold your head still, move your head around a lot, look at people when talking to them or look away, shake hands firmly, avoid shaking hands, whistle, hum, sing to yourself, hold your mouth open, keep your mouth closed, hold your jaw tight or relaxed, click your teeth, lick your lips, hold your stomach in, flex your muscles, speak loudly, speak softly, speak with a lot of emotion in your voice, speak in a detached manner, erupt easily into shouting, withdraw easily into silence, cry easily or rarely, have nervous tics involving your eyes, nose, mouth, ears, brow?) The list, as you can see, is virtually endless. And that’s only one category of behavior: physical habits.
If you think about it, though, you could probably extend the list of your physical habits until it encompassed much of your behavior, if only because we are physical beings. We have bodies and must conduct ourselves in the world through bodily movements.
But to extend your observational list, ask yourself pertinent questions about your daily life. What do you do, hour by hour and minute by minute, when you’re at work? When you shop, how do you shop? (Do you love shopping for clothes but hate shopping for groceries? Do you decide quickly what you want or take a long time deciding? Do you concern yourself with your budget while shopping, or momentarily forget about it? Do you love shopping online but hate going to stores?)
Behind every one of your habitual behaviors, there is a mine of information about yourself. Why do you talk on the phone a lot during work? Why do you love food-shopping but hate clothes-shopping? Etc.
Of course, you could also sharpen your powers of observation by focusing on other people, but you have much less access to the complexity of their world. The exercise of self-observation, by its almost limitless specificity (you could devote your life to the task if you wanted the most refined list possible), tells you what kinds of things to look for in describing a person, whether real or fictional. You become like a detective, noticing the details that others overlook.
You can’t write what you don’t observe. The less you observe, the more generic your writing will be. The more you observe, the better you’ll be at bringing a character, real or imagined, to life.
As promised, we’ll now take a look at situational detail -- the specific characteristics of a place, environment or interaction.
Why are travel books always popular? Because they provide DETAILS about places.
Why are pop psychologists who talk about relationships so popular? Because they provide DETAILS of intimate interactions.
We love to learn the details of other places and other lives.
Mostly, we’re interested in the details of situations involving people. The interior decoration of a home piques our curiosity not in itself but for what it tells us about the inhabitants of the home. The same is generally true of the great outdoors, which is why people prefer to see people in travel photos: this isn’t merely the Grand Canyon -- it’s Monique dwarfed by the Grand Canyon. James trying to hug a Giant Redwood has more impact on us than the tree standing alone.
Obviously we have the capacity to be interested in nature photography or landscape paintings as things of beauty, but writers tell stories (whether they’re true stories or fictional) about people. Therefore, the details of a place or a social milieu interest us for what they say about the people in a story. If an interview with a famous person takes place in that person’s garden, we want to know something about the garden. Is it an English garden? A Japanese garden? What does it look like? What kinds of flowers or plants grow there? Is it carefully tended? If so, by whom? A gardener? A spouse? The famous person? These details enhance our picture of the person to whom the garden belongs.
(Remember that the answers to these questions stimulate new questions: how does our interviewee interact with the gardener, and why? Why does the interviewee prefer this particular kind of garden? Did he grow up in a home with a similar one; does it remind him of a place or person from the past? Does he dislike the garden? Maybe it was created by someone else -- a spouse, a parent, a previous owner -- and he lacks either the power or the will to change it? The questions are as abundant as your curiosity, and they all illuminate a particular person and his relationships to other people and to his own past. )
In writing about an actual person, such situational details allow the author to probe, to uncover elements of personality that might not emerge from what the person says about himself. In fictional writing (a novel, short story, play), these kinds of details allow the author to construct a character more fully.
The 19th-century German philosopher Hegel described possessions as extensions of self. How does your garden or your bookcases or the pictures on your walls or your furniture represent you? All these things represent choices; our choices determine who we are.
In the largest environments, those we don’t own -- forests, fields, rivers, oceans, cities -- we want to know only the details that matter to a story about particular people who are interacting with that environment. For every detail about a lake that matters to our story, there are a thousand details that don’t. Our readers don’t need to know the chemical content of the water in the lake, or the dimensions of the lake, or the kinds of detritus floating on the surface, or the variations in water temperature, or the origin of the lake, UNLESS the chemical content or the dimensions or the detritus or the temperature or the origin (could it be man-made?) matter to the story being told.
If it were your obligation to give the most complete description of each environment in which a person or character appears, you’d never finish your book. You might eventually find a place in the Guinness Book of World Records for “Most Exhaustive Description of the New Jersey Turnpike Between Exits 12 and 14,” but you’d never finish your book.
So let’s conclude by saying: details always matter, but a few details matter a lot more than the rest. Hence the expression, a “telling detail.” That’s what writers seek, those details that “tell” about the people in a story. If they’re not telling, they don’t belong.
Here's the first of a series of posts on the single most important element of prose writing, regardless of whether it’s a play or a novel or a short story or an essay or an editorial or a nonfiction book. That element is FOCUS.
While the most important element of good sentences is CLARITY, the vital ingredient of any writing project is FOCUS. Your book or your play or your story MUST have a focal point.
At the start, you may not know (you probably won’t know) what your focus will be, but at some point you’ll have to determine what it is, because that is the tool with which you’ll craft your writing.
If you’re not sure why you’re writing, you won’t come up with anything worth reading.
Imagine a photograph without a focal point. You’re not sure what you’re looking at, because everything before your eyes is equally fuzzy. Nothing stands out with clarity. You get bored, or you get a headache. The same holds true for writing.
Great writing is intensely focused. Like a sharp photograph, sharp writing invites us to look at a particular subject in detail. If the writing is an editorial, we’re looking at the reasons why we should believe X. If it’s a novel, we’re looking at a main character or a group of characters, or both, in order to see what happens to them. The same with a play. In the case of a nonfiction book, we’re looking at a thesis, a central argument, which gives us a way to understand Human Evolution or the Bible or the Civil War or Democracy or Fascism or Electricity or Cars.
(Yes, there are coffee table books that give us lists or descriptions of things -- the cars of the 1960s; the plants of California -- but that’s not the kind of writing we’re concerned with, because it doesn’t require either craftsmanship or artistry.)
So, in the next few blogs I’ll take up several types of prose writing -- editorials, fiction, plays, essays, and nonfiction books -- in order to illustrate how FOCUS works in each case.
And here’s a good principle to keep in mind: LESS IS MORE.
By focusing your writer’s lens more and more sharply on a subject, you’ll be able to enter into it more and more profoundly -- you’ll be able to see more and thus say more about it.
This is why we call a weakly focused piece of writing “breezy.” It rushes past, barely disturbing us. It stimulates briefly but makes no lasting impression.
Strong writing is never breezy.
Teachers of novel-writing use a technique that also works well for nonfiction. It’s called “Why should I care?”
You can probably guess the point: you don’t want your readers saying “Why should I care?” about your story. If they do, then your book is missing something major. It’s your responsibility as the author to tell a story (whether it’s fictional or true) that entertains or stimulates those who’ve taken the time and paid the money to read it.
If your nonfiction book or essay has a clear and vibrant focal point, you will pass the “Why should I care?” test. People will care because you made them care. How did you do that? By exciting their interest in your topic. How did you do that? By laying out before them a tantalizing map that they’ve never seen before.
Say your topic is the American Civil War. Readers drawn to a book about the Civil War probably bring some knowledge, however rudimentary, to the subject. Furthermore, they certainly have an interest in History, which means they’ve probably read other historical books and thus acquired at least a little historical sophistication along the way. So, they’re going to be bored by a book that gives a 1-2-3 chronicle of events: first this happened, then that happened, then the next thing happened, etcetera ad seemingly infinitum.
What do they want from you? They want something new and exciting. A new way of understanding why the War happened, or how it was won, or why it’s still important to think about. That new way is your focal point, and it might look something like this:
“Until now most students of the Civil War have assumed that X was its primary cause. But that assumption is flawed. In the pages that follow, I aim to prove that Z was the hidden cause of the War. Once we understand the true impact of Z, we can, for the first time, fully appreciate why our political system collapsed in 1861 and what we must guard against in the future.”
This focal point forms a kind of map for the reader: in the first part of our journey, I’ll show you how most people have thought about my subject; in the second part of our journey, I’ll show what’s wrong with that understanding; in the third and longest leg of our trip (i.e. the bulk of my book), I’ll show you a new and better way to understand my subject; in the final stretch, I’ll show you why my approach is important.
As long as the focal point of my book (or essay) is clear and stimulating, my readers will want to take the trip with me, to see how well I do at leading them to a new understanding of a topic, no matter how familiar that topic may be. Because most topics are familiar (they’ve been written about before, probably a lot), my focal point had better be new and different. Otherwise, why would readers want to waste their time with it? They don’t want to read a book or article that repeats or resembles what they’ve read before.
Keep in mind that a nonfiction book isn’t really “proving” its point. It’s not like a geometry proof, airtight and incontestable. When a nonfiction author says he aims to “prove” that Z was the root cause of the Civil War, he means that, like a lawyer in a trial, he’ll try to convince you that his interpretation is the best one. He’ll do this by presenting solid evidence and impressive reasoning about that evidence (if he’s good). After which, you’ll say to yourself: I can hardly imagine how the Civil War could have been caused by anything else! That’s as close as we get to a “proof” in nonfiction writing: a highly persuasive argument -- so persuasive that we feel it’s the only good explanation around. Case closed.
If your nonfiction book offers a new and fascinating way to look at a subject in which I’m interested, I Will Care. If you make sure to explain to me, right at the start, WHY your focal point is new and fascinating, I Will Definitely Care. I might even buy it.
You may think that only nonfiction requires a focal point and that fiction, by contrast, can be “free form,” following the artistic whim of the novelist.
Well, it can be . . . but generally it isn’t. Generally, the good novelist, like the good nonfiction writer, maps out a very clear path for her characters to follow.
That path may not be immediately apparent to the reader. In fact, unlike nonfiction, a good novel or short story does NOT tell you at the start where it’s headed. But its author knows where the story is going, and she knows with almost mathematical precision. One of the joys of reading a strong novel or short story is the joy of discovery, the “ah-hah!” moment when you see where the story has been heading all along. Fiction writers use the element of surprise to entertain and delight and shock and provoke. That’s why a great story is never predictable (unlike an uninspired “formula story,” which is absolutely predictable).
The “thesis” of a novel, so to speak, is the drama of its protagonist (or protagonists). Will Oliver Twist escape a wretched fate and find happiness? Will Huck Finn face the challenges (especially the moral challenge) of his adventure on the river with his companion Jim, a runaway slave? Will Don Quixote (the hero of what many consider the first novel) attain (or survive) his chivalric goals? In each of these stories, a main character has a clear mission, a profound desire, an overarching goal -- and he must face a series of obstacles in order to fulfill it. That is the focus of these novels; that is their plot.
Whether a work of fiction centers on one main character, or a couple, or a trio, or an entire community, it should perform the same basic function: dramatizing a powerful, high-stakes struggle toward a fundamental goal (love, justice, freedom, survival, etc.).
A great short story demands an even stronger focus. Unlike a novel, which can traverse hundreds of pages and take many detours in the course of its journey, the ideal short story is about one moment. Many writers say that every sentence of a good short story must build to that single moment. There’s no slack, no space to waste.
That’s what focus is all about.
And if I tell you, “when it comes to FOCUS, playwriting follows the same rule as novels and short stories,” you shouldn’t be surprised.
Why? Because in Western civilization plays MADE the rules for creative writing. The novel didn’t emerge until the 17th, 18th, or 19th century (depending on which scholar you’re reading) but playwriting dates back to the ancient Greeks, whose work we continue to revere and stage. More than two thousand years ago, Aeschylus, Sophocles and Euripides laid the foundation of magnificent story-telling. They did this by creating tight, highly dramatic plots in which we follow the struggle of an individual toward a great, often tragic goal.
Is there any confusion about the focus of Oedipus Rex? A gifted king must find the reason for the plague that besets his city, so he sets out, like a detective, to solve the mystery (and learns, to his and our horror, that the solution is: himself). Or what about Antigone? A young woman feels in the pit of her soul that she must honor her dead brother by giving him a decent burial, even though the State, having branded him a traitor and forbidden his burial, will execute anyone who violates that ban.
Such stories as these dramatize a single, universally understood struggle (for Oedipus, the struggle is with himself; for Antigone it’s with the State). Virtually all great plays do the same thing. What does Hamlet want? To avenge his father’s death. What do George and Martha (of “Who’s Afraid of Virginia Woolf?”) want? To fix an almost tragically destructive marriage.
We understand these desires. They’re universal (the desire for revenge -- to see that evil is punished -- or the desire to fix a broken, intimate relationship, whether between spouses or siblings or friends or parents and children). That’s why, on one level, we continue to write and watch plays about the same basic struggles. The times change, the settings change, but people are still people. Our desires haven’t changed at all, and those desires are the focal points of plays (and, by extension, screenplays -- though these depend more on images than words). Incidentally, though the examples I’ve given here are tragedies or intense dramas, comedies take the same powerful drives (love, retribution, victory, etc.) but spin them toward a different conclusion. (Which is why so many sitcoms revolve around marital or family conflict. It’s a superb focal point.)
Yes, a great play raises all sorts of questions and leaves us thinking about all kinds of things. It may have subplots (i.e. secondary focal points) and a range of characters (each of whom has his own focal desire). But the play itself turns on ONE clear and overriding problem.
Step one for the playwright: determine “What is my play about?” When you come up with one clear answer, expressible in one clear sentence, then you’re ready for the fun (by “fun” I mean, of course, the agony of figuring out how to write the damn thing).