Common English usage misconceptions
This list comprises widespread modern beliefs about English language usage that are documented by a reliable source to be misconceptions.
With no authoritative language academy, guidance on English language usage can come from many sources. This can create problems, as described by Reginald Close:
Teachers and textbook writers often invent rules which their students and readers repeat and perpetuate. These rules are usually statements about English usage which the authors imagine to be, as a rule, true. But statements of this kind are extremely difficult to formulate both simply and accurately. They are rarely altogether true; often only partially true; sometimes contradicted by usage itself. Sometimes the contrary to them is also true.[1]
Many usage forms are commonly perceived as nonstandard or errors despite being either widely used or endorsed by authoritative descriptions.[2][a]
Perceived violations of correct English usage elicit visceral reactions in many people, or may lead to a perception of a writer as careless, uneducated, or lacking attention to detail. For example, respondents to a 1986 BBC poll were asked to submit "the three points of grammatical usage they most disliked". Participants said their points "'made their blood boil', 'gave a pain to their ear', 'made them shudder', and 'appalled' them".[3]
Grammar
[edit]- Misconception: "A sentence must not end in a preposition."[4][5][6]
Fowler's Modern English Usage says: "One of the most persistent myths about prepositions in English is that they properly belong before the word or words they govern and should not be placed at the end of a clause or sentence."[7] Preposition stranding was in use long before any English speakers considered it incorrect. This idea probably began in the 17th century, owing to an essay by the poet John Dryden, and it is still taught in schools at the beginning of the 21st century.[4] But "every major grammarian for more than a century has tried to debunk" this idea; "it's perfectly natural to put a preposition at the end of a sentence, and it has been since Anglo-Saxon times".[8] Many examples of terminal prepositions occur in classic works of literature, including the plays of Shakespeare.[5] The saying "This is the sort of nonsense up with which I will not put"[9][5][b] satirizes the awkwardness that can result from prohibiting sentence-ending prepositions. Associated Press style and Chicago Style both allow this usage.
- Misconception: "Infinitives must not be split."
"There is no such rule" against splitting an infinitive, according to The Oxford Guide to Plain English,[10] and it has "never been wrong to 'split' an infinitive".[11] In some cases it may be preferable to split an infinitive.[10][12] In his grammar book A Plea for the Queen's English (1864), Henry Alford claimed that because "to" was part of the infinitive, the parts were inseparable.[13] This was in line with a 19th-century movement among grammarians to transfer Latin rules to the English language. In Latin, infinitives are single words (e.g., amare, cantare, audire), making split infinitives impossible.[10]
- Misconception: "Conjunctions such as 'and' or 'but' must not begin a sentence."
Those who impose this rule on themselves or their students are following a modern English "rule" that has neither historical precedent nor universal acceptance in professional writing. Jeremy Butterfield described this perceived prohibition as one of "the folk commandments of English usage".[14] The Chicago Manual of Style says:
There is a widespread belief—one with no historical or grammatical foundation—that it is an error to begin a sentence with a conjunction such as "and", "but", or "so". In fact, a substantial percentage (often as many as 10 percent) of the sentences in first-rate writing begin with conjunctions. It has been so for centuries, and even the most conservative grammarians have followed this practice.[15][c]
Regarding the word "and", Fowler's Modern English Usage states: "There is a persistent belief that it is improper to begin a sentence with And, but this prohibition has been cheerfully ignored by standard authors from Anglo-Saxon times onwards."[16] Garner's Modern American Usage adds: "It is rank superstition that this coordinating conjunction cannot properly begin a sentence."[17] The word "but" suffers from similar misconceptions. Garner says: "It is a gross canard that beginning a sentence with but is stylistically slipshod. In fact, doing so is highly desirable in any number of contexts, as many style books have said (many correctly pointing out that but is more effective than however at the beginning of a sentence)".[18] Fowler's echoes this sentiment: "The widespread public belief that But should not be used at the beginning of a sentence seems to be unshakeable. Yet it has no foundation."[19]
- Misconception: "The passive voice is incorrect."
It is a misconception that the passive voice is always incorrect in English.[20] Some "writing tutors" believe that the passive voice is to be avoided in all cases,[21] but "there are legitimate uses for the passive voice", says Paul Brians.[22] Mignon Fogarty also points out that "passive sentences aren't incorrect"[23] and "If you don't know who is responsible for an action, passive voice can be the best choice".[24][d] When the active or passive voice can be used without much awkwardness, there are differing opinions about which is preferable. Bryan A. Garner notes: "Many writers talk about passive voice without knowing exactly what it is. In fact, many think that any BE-VERB signals passive voice."[25]
Some proscriptions of passive voice stem from its use to avoid accountability or as weasel words, rather than from its supposed ungrammaticality.
- Misconception: "Litotes or double negation (sometimes called 'double negatives') are always incorrect."
Some style guides use the term double negative to refer exclusively to the nonstandard use of reinforcing negations (negative concord, which is considered standard in some other languages), e.g., using "I don't know nothing" to mean "I know nothing". But the term "double negative" can sometimes refer to the standard English constructions called litotes or nested negatives, e.g., using "He is not unhealthy" to mean "He is healthy". In some cases, nested negation is used to convey nuance, uncertainty, or the possibility of a third option other than a statement or its negation. For example, an author may write "I'm not unconvinced by his argument" to imply they find an argument persuasive, but not definitive.[26]
Some writers suggest avoiding nested negatives as a rule of thumb for clear and concise writing.[27] Overuse of nested negatives can result in sentences that are difficult to parse, as in the sentence "I am not sure whether it is not true to say that the Milton who once seemed not unlike a seventeenth-century Shelley had not become [...]".
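The semantic contrast at issue here can be put in simple notation. The sketch below is schematic only: it assumes the classical-logic reading for litotes and treats the negative-concord reading as a single semantic negation, a simplification rather than a formal analysis of any dialect.

```latex
% Litotes / nested negation in standard English composes like
% classical double negation, so the negatives cancel:
%   "He is not unhealthy"  ~  not(not healthy)  ~  healthy
\[ \neg\neg p \equiv p \]

% Negative concord ("I don't know nothing" in some dialects) spells
% one semantic negation with several negative morphemes, so the
% negatives agree rather than cancel:
\[ \underbrace{\neg\,\neg\,\cdots\,\neg}_{n\ \text{negative morphemes}} p \;\Rightarrow\; \neg p \]
```

On the first reading the two negatives compose and cancel; on the second they agree, which is why "I don't know nothing" in a concord variety means "I know nothing," not "I know something."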
Usage
[edit]- Misconception: "Paragraphs must be at least three sentences long."
Richard Nordquist writes, "no rule exists regarding the number of sentences that make up a paragraph", noting that professional writers use "paragraphs as short as a single word".[28] According to the Oxford Guide to Plain English:
If you can say what you want to say in a single sentence that lacks a direct connection with any other sentence, just stop there and go on to a new paragraph. There's no rule against it. A paragraph can be a single sentence, whether long, short, or middling.[29]
According to the University of North Carolina at Chapel Hill's Writing Center's website, "Many students define paragraphs in terms of length: a paragraph is a group of at least five sentences, a paragraph is half a page long, etc." The website explains, "Length and appearance do not determine whether a section in a paper is a paragraph. For instance, in some styles of writing, particularly journalistic styles, a paragraph can be just one sentence long."[30]
- Misconception: "Contractions are not appropriate in proper English."
Writers such as Shakespeare, Samuel Johnson, and others since Anglo-Saxon days have been "shrinking English". Some opinion makers in the 17th and 18th centuries eschewed contractions, but beginning in the 1920s, usage guides have mostly allowed them.[31] Most writing handbooks now recommend using contractions to create more readable writing,[32] but many schools continue to teach that contractions are prohibited in academic and formal writing,[33][34][35] contributing to this misconception.
Semantics
[edit]- Misconception: "Some commonly used words are not 'real words'."
Common examples of words described as "not real" include "funnest", "impactful", and "mentee",[36][37] all of which are in common use, appear in numerous dictionaries as English words,[38][39][40][41] and follow standard rules for constructing English words from morphemes. Many linguists follow a descriptive approach to language, where some usages are labeled merely nonstandard, not improper or incorrect.
- Misconception: ""Inflammable" can only mean 'flammable'." / "'Inflammable' can only mean 'not flammable'."
The word "inflammable" can be derived by two different constructions, both following standard rules of English grammar: appending the suffix -able to the word inflame creates a word meaning "able to be inflamed", while adding the prefix in- to the word flammable creates a word meaning "not flammable". Thus "inflammable" is an auto-antonym, a word that can be its own antonym, depending on context. Because of the risk of confusion, style guides sometimes recommend using the unambiguous terms "flammable" and "not flammable".[42]
- Misconception: "It is incorrect to use 'nauseous' to refer to a person's state."
It is sometimes claimed that "nauseous" means "causing nausea" (nauseating), not suffering from it (nauseated). This prescription is contradicted by vast evidence from English usage, and Merriam-Webster finds no source for the rule before a published letter by a physician, Deborah Leary, in 1949.[43]
- Misconception: "It is incorrect to use 'healthy' to refer to things that are good for a person's health."
It is true that the adjective "healthful" has been pushed out in favor of "healthy" in recent times.[44] But the distinction between the words dates only to the 19th century. Before that, the words were used interchangeably; some examples date to the 16th century.[45] The use of "healthful" in place of "healthy" is now regarded as unusual enough that it may be considered a hypercorrection.[46]
Notes
- a.^ For example, among the top ten usage "errors" submitted to the BBC was the supposed prohibition against using double negatives.
- b.^ The Churchill Centre describes a similar version as "An invented phrase put in Churchill's mouth".[47]
- c.^ Chicago elaborates by noting Charles Allen Lloyd's observations on this phenomenon: "Next to the groundless notion that it is incorrect to end an English sentence with a preposition, perhaps the most wide-spread of the many false beliefs about the use of our language is the equally groundless notion that it is incorrect to begin one with 'but' or 'and'. As in the case of the superstition about the prepositional ending, no textbook supports it, but apparently about half of our teachers of English go out of their way to handicap their pupils by inculcating it. One cannot help wondering whether those who teach such a monstrous doctrine ever read any English themselves."[48]
- d.^ These authors are quick to point out, however, that the passive voice is not necessarily better—it's simply a myth that the passive voice is wrong. For example, Brians states that "it's true that you can make your prose more lively and readable by using the active voice much more often",[22] and Fogarty points out that "passive sentences aren't incorrect; it's just that they often aren't the best way to phrase your thoughts".[49]
References
- ^ Close 1964. n.p. (Front matter.) In a footnote to this text, Close also points to English as a Foreign Language by R. A. Close (George Allen and Unwin, London, 1962).
- ^ Close 1964. n.p. (Front matter.)
- ^ Jenny Cheshire, "Myth 14: Double Negatives are Illogical" in Bauer and Trudgill 1998. pp. 113–114.
- ^ a b Cutts 2009. p. 109.
- ^ a b c O'Conner and Kellerman 2009. p. 21.
- ^ Fogarty 2010. "Top Ten Grammar Myths".
- ^ Burchfield 1996. p. 617.
- ^ O'Conner and Kellerman 2009. p. 22.
- ^ "A misattribution no longer to be put up with". Language Log. 12 December 2004. Retrieved 29 May 2013.
- ^ a b c Cutts 2009. p. 111.
- ^ O'Conner and Kellerman 2009. p. 17.
- ^ O'Conner and Kellerman 2009. pp. 18–20.
- ^ O'Conner and Kellerman 2009. p. 19.
- ^ Butterfield 2008. p. 136.
- ^ University of Chicago Press 2010. p. 257.
- ^ Burchfield 1996. p. 52.
- ^ Garner 2003. p. 44.
- ^ Garner 2003. p. 118.
- ^ Burchfield 1996. p. 121.
- ^ Walsh 2004. pp. 61, 68–69.
- ^ Pullum 2009.
- ^ a b Brians 2009. p. 169.
- ^ Fogarty 2010. "Active Voice Versus Passive Voice".
- ^ Fogarty 2010. "Active Voice Versus Passive Voice".
- ^ Garner 2003. p. 592.
- ^ "double negative". Lexico. Oxford. Archived from the original on June 27, 2013.
- ^ "Politics and the English Language". The Orwell Foundation. 2011-02-16. Retrieved 2023-07-29.
- ^ Nordquist 2011.
- ^ Cutts 2009. p. 112.
- ^ University of North Carolina at Chapel Hill 2011.
- ^ Walsh 2004. pp. 61, 67–68.
- ^ O'Conner and Kellerman 2009. pp. 32–34.
- ^ "SJP: English MLA Style Sheet". Archived from the original on 2011-08-30. Retrieved 2012-04-09.. Saint Joseph’s Preparatory School
- ^ Basic Composition.com. Archived 2012-01-29 at archive.today.
- ^ Illinois Valley Community College.
- ^ Fogarty, Mignon (2008-09-12). "Is "Funnest" a Word?". Archived from the original on 2014-04-27. Retrieved 2012-09-25.
- ^ Volokh, Eugene (2007-08-23). "Is Not A Word". Retrieved 2012-09-25.
- ^ Dictionary.com. "Conversate"; AllWords.com. "Conversate"; Lexicus. "Conversate".
- ^ Dictionary.com. "Funnest"; Oxford English Dictionary. "Fun" Scrabble Word Finder. "Funnest"; AllWords.com. "Funnest"; Lexicus. "Funnest".
- ^ Dictionary.com. "Impactful"; Oxford English Dictionary. "Impactful"; Scrabble Word Finder. "Impactful"; Collins Dictionaries. "Impactful"; Lexicus. "Impactful".
- ^ Free Dictionary. "Mentee"; Dictionary.com. "Mentee"; Oxford English Dictionary. "Mentee"; YourDictionary.com. "Mentee"; Scrabble Word Finder. "Mentee"; AllWords.com. "Mentee"; Vocabulary.com. "Mentee"; Collins Dictionaries. "Mentee"; Lexicus. "Mentee".
- ^ Brians 2009. p. 124.
- ^ Merriam-Webster 1995. p. 652.
- ^ "Healthful vs healthy". Grammarist. 20 April 2011. Retrieved 2013-06-11.
- ^ O'Conner, Patricia; Kellerman, Stewart (2012-02-24). "Healthy choices". Grammarphobia Blog. Retrieved 2013-06-11.
- ^ Brians 2009. p. 108.
- ^ The Churchill Centre and Museum at the Churchill War Rooms, London 2011. (The original text is italicized.)
- ^ Lloyd 1938. p. 19. cited in University of Chicago Press 2010. p. 257.
- ^ Fogarty 2010. "Active Voice Versus Passive Voice".
Bibliography
- Bauer, Laurie; Trudgill, Peter, eds. (1998). Language Myths. London: Penguin Books. ISBN 978-0-14-026023-6.
- Bratcher, Dennis (3 December 2007). "The Origin of "Xmas"". CRI / Voice, Institute. Retrieved 10 June 2011.
- Brians, Paul (2009). Common Errors in English Usage (2nd ed.). Wilsonville: William, James & Company.
- Bringhurst, Robert (2005). The Elements of Typographic Style. Vancouver: Hartley and Marks. ISBN 0-88179-206-3.
- Burchfield, R. W., ed. (1996). Fowler's Modern English Usage. Oxford: Oxford University Press. ISBN 0-19-869126-2.
- Butterfield, Jeremy (2008). Damp Squid: The English Language Laid Bare. Oxford: Oxford University Press. ISBN 978-0-19-923906-1.
- The Churchill Centre and Museum at the Churchill War Rooms, London (March 2009). "Famous Quotations and Stories". Retrieved 30 August 2011.
- Close, R.A. (1964). The New English Grammar: Lessons in English as a Foreign Language. Massachusetts: Harvard University Press.
- Cutts, Martin (2009). Oxford Guide to Plain English (Third ed.). Oxford: Oxford University Press. ISBN 978-0-19-955850-6.
- Felici, James (24 August 2009). "To Double-Space or Not to Double-Space". CreativePro.com. Printingforless.com and CreativePro.com. Retrieved 31 March 2010.
- Fogarty, Mignon (2008). Grammar Girl's Quick and Dirty Tips for Better Writing. New York: Holt Paperbacks. ISBN 978-0-8050-8831-1.
- Fogarty, Mignon (22 July 2010). "Active Voice Versus Passive Voice". Grammar Girl: Quick and Dirty Tips for Better Writing. Retrieved 28 May 2011.
- Fogarty, Mignon (4 March 2010). "Top Ten Grammar Myths". Grammar Girl: Quick and Dirty Tips for Better Writing. Archived from the original on 13 March 2011. Retrieved 28 May 2011.
- Fogarty, Mignon (2011). Grammar Girl Presents the Ultimate Writing Guide for Students. New York: Henry Holt & Company. pp. 45–46. ISBN 978-0-8050-8943-1.
- Garner, Bryan A. (2003). Garner's Modern American Usage. New York: Oxford University Press. ISBN 0-19-516191-2.
- Howard, Phillip (1984). The State of the Language: English Observed. London: Hamish Hamilton. ISBN 0-241-11346-6.
- Jury, David (2004). About Face: Reviving the Rules of Typography. Switzerland: Rotovision SA. ISBN 2-88046-798-5.
- Lloyd, Charles Allen (1938). We Who Speak English: and Our Ignorance of Our Mother Tongue. New York: Thomas Y. Crowell.
- Merriam-Webster (2011). "Irregardless". Merriam-Webster. Retrieved 27 October 2011.
- Merriam-Webster (1995). Merriam Webster Dictionary of English Usage. Merriam-Webster.
- Nordquist, Richard (2011). "Top Five Phony Rules of Writing". About.com. New York Times Company. Archived from the original on 4 September 2011. Retrieved 8 June 2011.
- O'Conner, Patricia T.; Kellerman, Stewart (2009). Origins of the Specious: Myths and Misconceptions of the English Language. New York: Random House. ISBN 978-1-4000-6660-5.
- O'Conner, Patricia T. (2009). Woe is I: The Grammarphobe's Guide to Better English in Plain English (Third ed.). New York: Riverhead Books. ISBN 978-1-59448-890-0.
- Pullum, Geoffrey K. (17 April 2009). "50 Years of Stupid Grammar Advice". The Chronicle Review. The Chronicle of Higher Education. Retrieved 28 May 2011.
- Spencer, David (24 May 2011). "The Curious Misconception Surrounding Sentence Spacing". Type Desk. Matador. Archived from the original on 10 June 2011. Retrieved 27 May 2011.
- Strizver, Ilene (2010). Type Rules!: The Designer's Guide to Professional Typography (3rd ed.). New Jersey: John Wiley & Sons. ISBN 978-0-470-54251-4.
- University of Chicago Press (2010). The Chicago Manual of Style (16th ed.). Chicago: Univ. of Chicago Press. ISBN 978-0-226-10420-1.
- University of Chicago Writing Program. "Grammar Resources". University of Chicago Writing Program. University of Chicago. Archived from the original on 26 October 2011. Retrieved 25 October 2011.
- Walsh, Bill (2004). The Elephants of Style: A Trunkload of Tips on the Big Issues and Gray Areas of Contemporary American English. New York: McGraw Hill. ISBN 978-0-07-142268-0.
- The Writing Center. "Paragraph Development". University of North Carolina at Chapel Hill. Retrieved 27 May 2011.
External links
- Patricia T. O'Conner and Stewart Kellerman (2003). "Grammar Myths". Grammarphobia.com. Retrieved 4 June 2011.
- Richard Nordquist (2011). "Is It Wrong to End a Sentence in a Preposition?". About.com. New York Times Company. Archived from the original on 24 August 2011. Retrieved 8 June 2011. Lists additional published sources that comment on ending a sentence with a preposition.
Grammatical Myths
Preposition Stranding
The misconception that prepositions cannot appear at the end of a sentence, known as preposition stranding, stems from 17th-century efforts to model English grammar on Latin, where prepositions typically precede their objects and cannot be stranded.[3] This prescriptive rule gained prominence through poet and critic John Dryden's 1672 essay critiquing earlier writers, including Ben Jonson's line from Catiline (1611): "The bodies that those souls were frighted from," which Dryden deemed ungraceful for placing the preposition at the end.[4] Dryden's objection, influenced by Latin syntax, popularized the idea despite English's Germanic roots allowing more flexible word order, including stranding in relative and interrogative clauses.[5]

Preposition stranding has long been a natural feature of English, appearing in the works of esteemed writers well before Dryden's critique. William Shakespeare employed it extensively, with his complete works containing numerous instances of sentences ending in prepositions such as "up," "to," and "on" (e.g., "We are such stuff / As dreams are made on" from The Tempest).[6] Constructions like "the man I spoke of" likewise demonstrate stranding in relative clauses without compromising clarity or elegance.[7]

In contemporary English, major style guides affirm the acceptability of preposition stranding, particularly when it enhances readability and natural flow. The Chicago Manual of Style (17th edition) explicitly states there is no rule against ending a sentence with a preposition, recommending it over awkward pied-piping alternatives unless the context demands formality.[8] For instance, in relative clauses, "the house that I live in" is preferred for its directness over "the house in which I live," which can sound stilted.[3] While some formal academic or legal writing may avoid stranding to maintain an elevated tone, it is standard in informal and general prose, reflecting English's idiomatic evolution. This flexibility parallels considerations in adverb placement, such as with split infinitives, where natural expression trumps rigid rules.[5]

Split Infinitives
A split infinitive occurs when an adverb or adverbial phrase is inserted between the infinitive marker "to" and the base form of the verb, as in "to boldly go." The misconception that such constructions are grammatically incorrect stems from an artificial rule imposed on English in the 19th century, despite their natural occurrence in the language for centuries. This prohibition lacks foundation in English's inherent structure, where infinitives consist of two words, unlike the single-word infinitives in Latin, which early grammarians sought to emulate.[9][10]

The rule gained prominence through Henry Alford's 1864 book The Queen's English, where he decried split infinitives as "entirely unknown to English speakers and writers" and lacking "any good reason," advocating avoidance to align English more closely with classical languages. Alford's influence amplified earlier prescriptive efforts, but historical evidence shows split infinitives appearing as early as the 13th century in Middle English texts, including two instances in Geoffrey Chaucer's works. By the early modern period, usage became rarer—absent entirely in the King James Bible (1611)—but reemerged in writers like Daniel Defoe and Lord Byron, demonstrating the construction's persistence despite prescriptive objections.[11][12][13]

Modern style guides overwhelmingly reject the absolute ban on split infinitives, recognizing their value for clarity and emphasis. A quintessential example is the Star Trek opening narration, "to boldly go where no man has gone before," introduced in 1966, which places "boldly" to stress the manner of exploration and has since become emblematic of the construction's acceptability. The Associated Press Stylebook explicitly permits splitting infinitives or compound verbs when necessary "to convey meaning and make a sentence easy to read," prioritizing natural flow over rigid adherence to outdated rules.[14][15]

Splitting an infinitive often avoids awkward phrasing or unintended shifts in emphasis; for instance, "to go boldly" might imply the boldness applies only to the going, whereas "to boldly go" underscores the adverb's modification of the entire action. This flexibility parallels English's tolerance for preposition stranding, allowing word order to serve idiomatic expression rather than classical imitation.[9][10]

Conjunctions at Sentence Beginnings
A common misconception in English grammar holds that sentences should never begin with coordinating conjunctions such as "and" or "but," viewing it as a mark of poor writing or incomplete thoughts.[16] This prescriptive rule emerged in the 19th century among schoolteachers, who sought to discourage students from producing run-on sentences by overusing conjunctions to link ideas without proper punctuation.[17] However, this prohibition lacks foundation in the language's historical or structural rules and contradicts longstanding usage in both spoken and written English.

In reality, starting a sentence with a coordinating conjunction has been grammatically acceptable since at least the 9th century, as evidenced by early English texts, and it serves to create emphasis, signal transitions, or mimic natural conversational flow. Literary works frequently employ this construction; for instance, Jane Austen's Pride and Prejudice opens with a sentence beginning "It," but subsequent dialogue includes lines like "But you must not expect me to adopt your inexplicable, fantastic, petulant, fastidious ways," demonstrating its role in character speech and narrative rhythm.[19] Similarly, the King James Bible extensively uses initial conjunctions, with 12,846 sentences starting with "and" and 1,558 with "but," reflecting the Hebrew and Greek originals' connective style (waw and kai) for continuous storytelling.[20]

Modern style guides affirm this practice as valid and often beneficial. The Oxford English Dictionary and its associated resources, through Oxford University Press, explicitly state that beginning sentences with "and" or "but" is not erroneous and can enhance readability by varying sentence structure.[17] The MLA Handbook similarly endorses it, noting that while some view it as informal, it is not incorrect and aids in connecting ideas across sentences, as in essay examples like "But I digress" to refocus the reader.[21] These authorities emphasize its utility in academic and professional writing for emphasis or narrative cohesion, countering the outdated schoolroom edict.

Although permissible, overuse of initial conjunctions should be avoided in highly formal contexts, such as structured lists or legal documents, where it might appear choppy or overly casual; even then, it remains grammatically sound rather than prohibited.[17] This technique can complement other stylistic choices, like passive voice, to achieve varied sentence rhythm without rigid constraints.[21]

Passive Voice Usage
A common misconception in English usage holds that the passive voice should be avoided entirely in favor of the active voice, as it supposedly renders writing weak, vague, or evasive.[22] This view gained prominence through William Strunk Jr. and E.B. White's The Elements of Style (first published in 1918 and revised in 1959), which asserts that the active voice is "forceful and clear" while the passive is to be used only when the performer of the action is unknown or unimportant.[22] However, linguists such as Geoffrey Pullum have critiqued this advice as flawed, noting that Strunk and White misidentify non-passive constructions as passive (e.g., labeling "There were a great number of dead leaves lying on the ground" as passive) and overlook the passive's legitimate syntactic and rhetorical roles.[22] In reality, the passive voice—formed by combining a form of "to be" with the past participle of the main verb (e.g., "The ball was thrown by the player")—enhances objectivity and focus in specific contexts, countering the blanket prohibition.[23]

The passive voice has been integral to formal English genres like legal and academic writing since at least the 18th century, when grammarians began standardizing its use for impersonality and precision.[24] In legal English, passives emphasize actions over agents, as seen in statutes where the focus is on obligations or outcomes rather than specific enforcers; for instance, analyses of modern UK legislation reveal passives in about 35% of verbs, a pattern rooted in historical efforts to maintain formality and universality.[25] Similarly, in academic and scientific prose, the shift from active voice dominance in the 18th century to widespread passive use in the 19th and early 20th centuries allowed writers to prioritize procedures and results, such as "The solution was heated to 100°C," without foregrounding the researcher.[26] This convention persists because passives suit scenarios where the agent is irrelevant, unknown, or collectively understood, promoting clarity in objective reporting.

Consider the example "Mistakes were made," a passive construction often employed in political or corporate statements to acknowledge errors without specifying responsibility, thereby evading direct blame.[27] In contrast, when the actor is known and relevant, the active voice provides greater vigor and accountability: "The team made mistakes" explicitly identifies the subject.[22] Such choices highlight that neither voice is inherently superior; the passive excels when the action or recipient takes precedence, as in "The data were analyzed using statistical software," where the method's application matters more than who performed it.

Style guides like the American Psychological Association (APA) endorse selective passive use, recommending it when the focus is on the action or object rather than the subject, while favoring active voice for conciseness elsewhere.[28] APA's Publication Manual (7th ed., Section 4.13) advises consistency within sections—e.g., passives in methods descriptions ("Participants were recruited via email") but actives in discussions ("We found significant differences")—to balance objectivity and readability.[29] This nuanced approach underscores the passive's value in professional writing, where it can even integrate with sentence-initial conjunctions to build complex, logical arguments without sacrificing precision.[28]

Double Negatives
A common misconception in English usage holds that double negatives—constructions employing two or more negative elements in a single clause—are invariably incorrect and logically equivalent to a positive assertion. In reality, such structures have deep historical roots in the language and serve specific rhetorical or emphatic purposes, though they are often discouraged in formal standard English for the sake of clarity. This belief stems from an overapplication of mathematical logic to grammar, where two negatives cancel to produce a positive, but linguistic negation functions differently, allowing multiples to reinforce rather than negate each other.[30]

Double negatives trace back to Old English, where they were a standard means of intensifying negation rather than canceling it, a practice carried into Middle English. For instance, Geoffrey Chaucer's The Canterbury Tales (c. 1400) frequently employs multiple negatives for emphasis, as in the General Prologue's description of the Knight: "He nevere yet no vileynye ne sayde" (meaning he never said anything rude). This reinforcement of negation contrasted with later influences from Latin grammar during the Renaissance, which promoted single negatives as more "logical," leading to the decline of the form in standard written English by the 18th century. Unlike modern propositional logic, where ¬(¬p) ≡ p, historical English treated multiple negations as additive for emphasis, not subtractive.[31][32][33]

In contemporary standard English, double negatives appear in acceptable forms like litotes, an understatement that affirms a positive by negating its opposite, often creating a modest or ironic tone. Examples include "not uncommon" (meaning frequent or common) or "not unhealthy" (implying reasonably healthy), which avoid the emphatic slang of nonstandard varieties while enhancing rhetorical subtlety—sometimes akin to passive constructions for understated emphasis. By contrast, in dialects such as African American Vernacular English (AAVE), double negatives function as negative concord, where multiples intensify the negation without altering its meaning, as in "I ain't got none" (meaning I have none) or "Ain't nothing wrong" (meaning nothing is wrong). This usage is rule-governed within AAVE and not a logical error, though it diverges from standard English norms.[34][35]

Major style guides, such as Merriam-Webster's Dictionary of English Usage, classify emphatic double negatives (e.g., "didn't see nothing") as nonstandard but acknowledge their historical legitimacy and dialectal validity, advising avoidance in formal writing primarily to prevent ambiguity rather than deeming them grammatically invalid. Similarly, The Chicago Manual of Style (17th ed.) recommends rephrasing multiple negatives for clarity, noting that they can imply unintended positives or confusion, as in "can't help but not agree," but does not prohibit them outright in all contexts. These guidelines prioritize readability over rigid prohibition, recognizing double negatives' role in expressive language.[36]

Writing and Style Conventions
Paragraph Length Requirements
A common misconception in English composition teaching holds that paragraphs must consist of multiple sentences, typically three to five, to be structurally valid. This notion stems from rigid pedagogical approaches in early 20th-century writing instruction, which emphasized formulaic structures like the five-paragraph essay to teach organization, but it lacks foundation in English grammar or style conventions.[37][38] In reality, there are no grammatical rules dictating a minimum number of sentences per paragraph; paragraphs are units of thought defined by thematic unity rather than fixed length.[39]

Single-sentence paragraphs are not only permissible but serve deliberate stylistic purposes, such as creating emphasis, pacing dialogue, or delivering impactful statements in journalism and narrative writing. For instance, Ernest Hemingway frequently employed short paragraphs, including single-sentence ones, to heighten tension and clarity in works like The Old Man and the Sea, contributing to a minimalist style that prioritizes rhythm and reader engagement over elaboration.[40] Style guides endorse this flexibility, advising that paragraph length should align with the flow of ideas, allowing brief constructions for emphasis without prescriptive limits on sentence count.[41] Historically, the Declaration of Independence exemplifies varied paragraph lengths, with many grievance sections consisting of single sentences to underscore specific indictments against the British Crown, enhancing rhetorical force.[42]

Writers should employ single-sentence paragraphs judiciously for transitions between ideas, punchy conclusions, or dramatic effect, as overuse can disrupt readability and dilute impact. This technique complements other devices, such as beginning sentences with conjunctions, to maintain natural flow in prose.[41] Avoiding excessive brevity ensures paragraphs remain cohesive units that advance the overall argument or narrative.

Contractions in Formal Writing
A common misconception holds that contractions, such as "don't" or "it's," are inherently informal and thus prohibited in formal writing, including academic, journalistic, and professional contexts.[43] In reality, contractions have a long history of acceptance in various registers of English, and their use in formal writing depends on the specific style guide, audience, and purpose, often enhancing readability without compromising professionalism.[44]

Contractions trace their origins to 16th-century English, where they appeared frequently in literature, including William Shakespeare's works, such as elisions like "th'" for "the" and contractions with "is" or "will" (e.g., "she'll" or "there's").[45] By the late 18th and into the 19th century, however, etiquette manuals and grammarians increasingly viewed them as vulgar or overly familiar, leading to a widespread prohibition in formal writing that persisted through Victorian-era prescriptions.[43] This taboo began to erode in the early 20th century, with a notable revival in the 1920s, as evidenced by their unremarked inclusion in H.W. Fowler's influential A Dictionary of Modern English Usage and their growing presence in newspapers and magazines.[43]

In contemporary usage, major style guides endorse contractions in most formal writing to promote clarity and natural flow, provided they are not overused. For instance, The Economist Style Guide explicitly permits forms like "don't," "isn't," "can't," and "won't" in non-legal contexts, advising moderation to maintain readability.[46] Similarly, the Chicago Manual of Style does not prohibit them, recognizing their role in conversational yet professional prose.[44] This acceptance extends to academic writing, where contractions such as "it's" appear in abstracts and discussions to avoid repetitive stiffness from full forms like "it is," thereby improving paragraph rhythm.[47]

Exceptions persist in highly precise or traditional genres, such as U.S. Supreme Court opinions, where justices generally avoid contractions to uphold a formal tone and ensure unambiguous interpretation, though occasional uses by modern justices like Elena Kagan signal evolving norms.[48]

Apostrophe Misapplications
One prevalent misconception in English usage involves employing the apostrophe to form plurals, often termed the "greengrocer's apostrophe," as seen in signs advertising "apple's for sale" or "banana's 50p each."[49] This error stems from a misunderstanding that the apostrophe signals plurality, whereas it serves solely to indicate omission of letters in contractions or to denote possession.[50] Historically, from the 17th to 19th centuries, apostrophes were occasionally used for certain noun plurals, particularly loanwords ending in vowels, but this practice fell out of favor by the 19th century as prescriptive grammars standardized rules against it.[51]

The core rules for apostrophe application distinguish between contractions and possessives, a frequent source of confusion. In contractions, the apostrophe replaces omitted letters, as in "it's" for "it is" or "has," contrasting with the possessive pronoun "its," which requires no apostrophe.[52] For possessives, singular nouns take an apostrophe followed by "s" (e.g., "the dog's bone"), while plural nouns ending in "s" take only an apostrophe after the "s" (e.g., "the dogs' bones"); non-"s" plurals add apostrophe plus "s" (e.g., "the children's toys").[52] Style guides, including The Guardian's, explicitly decry the plural misuse, emphasizing that apostrophes never form standard plurals like "apples" or "1990s."[52]

Such misapplications persist in informal contexts like public signage during the 2020s, according to linguistic analyses of urban communication. For example, in 2024, the Yorkshire Dialect Society criticized a local council's anti-litter campaign for the sign "Gerrit in t'bin," which misused the apostrophe, sparking debate on dialect and punctuation in public communication.[53][54] A 2020 study of English writing errors highlighted apostrophe misuse in possessives as a common issue among non-native users, contributing to broader punctuation inconsistencies in public displays.[55] These trends reflect evolving informal norms, yet authoritative sources maintain that correct usage enhances clarity.[49]
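Because the possessive rules above are mechanical, they can be stated as a short procedure. The following is a minimal sketch, assuming the caller already knows whether the noun is plural; the function name possessive is invented for this example, and style-guide variants (e.g., classical names ending in "s") are deliberately ignored.

```python
def possessive(noun: str, is_plural: bool) -> str:
    """Apply the conventional possessive rules described above:
    singular nouns take "'s"; plurals ending in "s" take a bare
    apostrophe; plurals not ending in "s" take "'s".
    A naive sketch; irregular and style-specific cases are ignored.
    """
    if is_plural and noun.endswith("s"):
        return noun + "'"    # the dogs' bones
    return noun + "'s"       # the dog's bone, the children's toys

# The examples from the text above:
assert possessive("dog", is_plural=False) == "dog's"
assert possessive("dogs", is_plural=True) == "dogs'"
assert possessive("children", is_plural=True) == "children's"
```

Note that nothing in the procedure ever forms a plural, which is precisely the greengrocer's error: "apples" needs no apostrophe at all.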
Comma Usage Errors

One prevalent misconception in English comma usage involves the so-called Oxford comma, or serial comma, which appears before the conjunction in a list of three or more items, as in "red, white, and blue." This comma is optional in many contexts but is recommended for clarity to prevent ambiguity, such as distinguishing "I invited my parents, Ayn Rand and God" from "I invited my parents, Ayn Rand, and God." The Associated Press (AP) Stylebook advises against using the Oxford comma in simple series, stating that commas should separate elements but not precede the conjunction before the final item.[56] In contrast, the Chicago Manual of Style requires the serial comma for series of three or more items to ensure precision, though it allows flexibility when clarity demands it.[57] This variation across style guides underscores that no universal mandate exists, countering the myth that the Oxford comma is either always required or strictly forbidden.

Another common error, often misunderstood as a minor oversight rather than a structural fault, is the comma splice, where two independent clauses are joined solely by a comma without a coordinating conjunction, as in "I came, I saw, I conquered," which fuses clauses that require a conjunction, semicolon, or period for proper separation. A comma splice occurs when a comma links complete sentences that could stand alone, such as "The hat does not fit, it's too tight," and is generally considered a grammatical error in formal writing.[58] However, this practice has historical roots and persists in informal contexts, poetry, or fiction where short, related clauses benefit from the rhythmic pause, challenging the absolute prohibition against it in all prose.[58] To correct a splice, options include adding a conjunction ("I came, and I saw"), replacing the comma with a semicolon ("I came; I saw"), or forming separate sentences.

Historically, 19th-century English writing suffered from "overpunctuation," with commas inserted excessively before every subordinate clause or phrase, reflecting a rhetorical emphasis on pauses that cluttered text.[59] This trend, a vestige of 18th-century practices, gave way to modern minimalism by the early 20th century, as advocated in H.W. Fowler and F.G. Fowler's The King's English (1906), which promoted lighter punctuation to enhance readability.[59] Today, commas aid transitions between paragraphs by signaling related ideas without overcomplicating structure.

A frequent point of confusion arises with nonessential clauses, which provide supplementary information and must be enclosed in commas, unlike essential clauses that define the subject and require no punctuation. For instance, in "The book, which I read last summer, was excellent," the clause "which I read last summer" is nonessential and thus set off by commas, as removing it does not alter the sentence's core meaning.[60] Conversely, "The book that I read last summer was excellent" uses an essential clause without commas, as it specifies which book.[60] Misapplying commas here can obscure intent, perpetuating the misconception that all relative clauses demand identical treatment.

Lexical and Semantic Misconceptions
Evolving Word Definitions
The evolution of word definitions in English reflects the dynamic nature of language, where usage by speakers often precedes formal recognition in dictionaries. This process pits descriptivism, which observes and records how language is actually used, against prescriptivism, which advocates for adherence to established rules and resists change.[61] Descriptivists argue that languages naturally shift over time through semantic broadening, narrowing, or pejoration, driven by cultural and social influences, while prescriptivists view such changes as degradations of purity.[62] In the 20th century, numerous slang terms transitioned to standard English, illustrating this tension; for instance, "cool," originating as African American jazz slang in the 1940s to denote stylish composure, entered mainstream dictionaries by the 1960s as a general term for approval or acceptability.[63]

A prominent example of semantic shift is the adverb literally, traditionally meaning "in a literal sense," but increasingly employed hyperbolically for emphasis, as in "I literally died laughing" to convey extreme amusement rather than actual death. This figurative usage, attested since the 17th century, faced purist backlash but gained formal acceptance when Merriam-Webster updated its dictionary in 2013 to include the sense of "in effect; in fact; actually," acknowledging its widespread colloquial application.[64]

Similarly, the word fun, once strictly a noun, has developed adjectival uses with informal comparatives like funner and funnest, despite preferences for "more fun" and "most fun" in formal writing. The Oxford English Dictionary recognizes funnest as a superlative form in contemporary usage, reflecting descriptivist inclusion of evolving patterns over prescriptivist resistance to irregular morphology.

Another case involves impact, used as a verb since the 17th century and established as a noun denoting collision or influence since the 18th; its transitive figurative sense—"to affect strongly"—drew criticism as nonstandard starting in the 1960s amid corporate and journalistic adoption. By the late 20th century, this verbal form became entrenched in standard English, with dictionaries like the American Heritage Dictionary noting its acceptability despite earlier objections, as it filled a need for concise expression of causal effects.[65]

These shifts often intersect with new word formations, where evolving prefixes like "re-" in "reimpact" (to strike again) mirror broader semantic adaptations. Overall, dictionary updates serve as milestones in legitimizing such changes, prioritizing evidence from corpus data over traditionalist ideals.

Confusable Prefixes and Suffixes
One common misconception in English usage arises from the prefix in-, which derives from Latin and can function either as a negation (meaning "not") or as an intensifier (meaning "in" or "thoroughly"), leading to confusion in words like inflammable and invaluable. The adjective inflammable, first attested in 1605, means "capable of being easily ignited and of burning quickly," synonymous with flammable, rather than its opposite; this stems from the Latin inflammāre ("to set on fire"), where in- intensifies the action of flammāre ("to flame"). To avoid ambiguity, nonflammable (or non-inflammable) emerged in the 19th century as the clear antonym for materials that do not burn easily, highlighting how the prefix's dual role creates auto-antonyms—words that are their own opposites.

Similarly, invaluable, recorded since 1576, denotes something "valuable beyond estimation or measure," not worthless; it combines in- (negating the older sense of valuable as "capable of being valued") with the root, resulting in a positive intensification due to Latin's flexible prefixation. These examples illustrate how historical Latin borrowings perpetuate myths that such terms contradict themselves, when in fact they affirm value or combustibility through emphatic rather than negating morphology.

Another frequent confusion involves neologistic suffixes, as seen in mentee, a term for "one who is being mentored" or a protégé, which has been accepted in major dictionaries despite criticisms that it is redundant alongside existing words like protégé. Coined by analogy with the -or/-ee pairing (e.g., mentor/mentee), mentee first appeared in print in the 1940s, gaining widespread use in professional and educational contexts by the late 20th century, and is now standard in sources like Merriam-Webster and the Oxford English Dictionary. Critics argue it is an unnecessary invention, but its inclusion reflects English's productive suffixation patterns, where -ee denotes the recipient of an action, evolving naturally without violating grammatical norms.

The adverb irregardless also fuels debate due to its prefix ir-, blended illogically with regardless to mean "without regard" or "despite everything," though it is widely labeled nonstandard. First documented in 1795, irregardless likely arose as a blend of irrespective and regardless, with ir- redundantly negating regardless, which already carries the negative suffix -less; the Oxford English Dictionary notes its persistence in North American English, particularly informal speech, despite prescriptive advice to use regardless instead. This case exemplifies how prefix redundancy leads to perceptions of error, even as the word's history follows English's pattern of morphological blending.

Such confusions with affixes often parallel broader semantic shifts, as in the evolving use of literally to mean "figuratively," underscoring English's adaptive nature.

Adjectival vs. Adverbial Misuses
One prevalent misconception in English usage involves the interchangeable or incorrect application of adjectives and adverbs, particularly with words whose meanings have evolved or where traditional distinctions are rigidly enforced despite linguistic shifts. Adjectives like "nauseous" and "healthy" are often debated for their precise roles in describing states or qualities, while adverbs such as "hopefully" face criticism when used as sentence modifiers rather than strictly manner adverbs. These issues stem from prescriptive grammar rules that overlook historical and contemporary evidence of flexibility in part-of-speech functions.[66]

The adjective "nauseous" traditionally meant "causing nausea," with "nauseated" reserved for the state of feeling nauseous, as derived from Latin roots where "nauseosus" implied inducing sickness. However, by the 20th century, "nauseous" had commonly extended to mean "affected with nausea," a usage now standard in major dictionaries. For instance, the Oxford English Dictionary and Merriam-Webster both recognize "nauseous" as describing the feeling of sickness, reflecting its widespread acceptance since at least the mid-1900s. Purists still advocate for "She felt nauseated" over "She felt nauseous" to maintain the distinction, but empirical data from corpora like the Corpus of Contemporary American English show "nauseous" significantly more common than "nauseated" in this sense in recent decades.[67][68]

Similarly, "healthy" is misconstrued as solely applicable to living beings in good physical condition, whereas "healthful" is sometimes insisted upon for things promoting health, such as food. Historically, "healthy" has been used interchangeably with "healthful" since the 16th century, with the Oxford English Dictionary noting early examples of "healthy" for both people and beneficial items. Jane Austen, in Sense and Sensibility (1811), employed "healthy" for people, as in describing a character as "very stout and healthy," illustrating its established application to human vitality long before modern prescriptive debates. Today, "healthy food" is the dominant phrasing in American English, endorsed by style guides like the Associated Press, though "healthful" persists in formal contexts to emphasize causation. This overlap ties briefly into prefix ambiguities, such as "in-" in words like "inevitable," where morphological roles can blur adjectival senses.[69]

The adverb "hopefully" exemplifies misuse claims when functioning as a sentence adverb, as in "Hopefully, it will rain," which some argue should be limited to manner ("She waited hopefully"). This sentence-adverbial use emerged in the 1930s but surged in the 1960s, becoming a hallmark of informal English despite early backlash from grammarians who deemed it imprecise. Merriam-Webster and the AP Stylebook have since affirmed its legitimacy as a disjunct adverb expressing hope, with usage frequencies in the Google Books Ngram Viewer showing exponential growth post-1965. Critics' insistence on the "manner only" restriction ignores precedents like "fortunately" and "regrettably," which perform similar roles without controversy.[70][71]

Spelling and Orthographic Myths
Vowel Sequencing Rules
The "i before e except after c" mnemonic emerged in 19th-century English grammar textbooks, such as Simon Laurie's Manual of English Spelling (1866), as a simple aid for remembering the order of vowels in digraphs like "ie" and "ei" when they represent the long /iː/ sound.[72] However, the rule's universality is a misconception; a statistical analysis of over 350,000 English words by University of Warwick researcher Nathan Cunningham showed that the rule fails in numerous cases, with "ie" outnumbering "ei" only about 3:1 overall, and the "except after c" exception not significantly altering this pattern.[73] Common exceptions include "weird," "seize," "caffeine," and "foreign," where "ei" precedes without "c" or the sound deviates from /iː/.[74] The rule holds more reliably for /iː/ after "c" (e.g., "receive," "ceiling"), covering about 75% of relevant cases, but linguists emphasize it as a limited mnemonic rather than a strict orthographic law.[74] Words like "neighbor" illustrate further limitations, featuring "ei" without "c" and pronounced /ˈneɪbər/ as a diphthong /eɪ/. This irregularity stems from English orthography's evolution, particularly the Norman Conquest of 1066, which overlaid French spelling norms onto Germanic roots, creating inconsistent vowel patterns without phonological standardization.[75] Modern educators often recommend memorizing exceptions over rigid rule application, as the mnemonic's exceptions undermine its pedagogical value.[74]Indefinite Article Choices
Indefinite Article Choices

The choice between the indefinite articles "a" and "an" in English is governed by phonetics rather than orthography, with "an" used before words beginning with a vowel sound and "a" before those starting with a consonant sound.[76] This rule applies regardless of the initial letter's appearance; for instance, "an hour" is correct because "hour" begins with the vowel sound /aʊər/, while "a university" uses "a" since "university" starts with the consonant sound /juː/.[77] A common misconception arises from focusing on spelling, leading learners to incorrectly pair "a" with vowel-letter words like "apple," resulting in ungrammatical phrases such as "a apple" instead of "an apple."[78] Historically, both forms derive from the Old English numeral "ān," meaning "one," which initially lacked the vowel-consonant distinction but evolved in Middle English to differentiate based on pronunciation for smoother speech flow.[79]

This phonetic basis extends to acronyms and abbreviations, where the choice depends on their spoken form; for example, "an HTML document" is appropriate if pronounced letter-by-letter as /eɪtʃ tiː ɛm ɛl/, starting with the vowel sound /eɪ/, whereas "a UFO" uses "a" for the consonant sound /juː/.[76] Style guides, including the Associated Press (AP) Stylebook, reinforce this sound-based rule, advising "a" before consonant sounds (e.g., "a historic event") and "an" before vowel sounds (e.g., "an hourly rate") to ensure clarity in professional writing.[80] Such errors are particularly prevalent among English as a Second Language (ESL) speakers, who may over-rely on visual cues from their native languages, producing forms like "an university" in place of "a university."[81]

Exceptions occur with words featuring a silent initial "h," where the underlying vowel sound dictates "an," as in "an honor" (/ˈɒnər/) or "an heir," despite the letter "h" suggesting a consonant.[82] This phonetic principle also keeps spoken English smooth by preventing awkward vowel-vowel clashes.[76]
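Because the rule is phonetic, spelling alone cannot decide it, so any letter-based procedure needs pronunciation exceptions. The sketch below is a deliberately naive heuristic for illustration only: the two exception tuples are invented for this example and are far from complete, and a robust implementation would consult a pronouncing dictionary rather than prefixes.

```python
# Naive "a"/"an" chooser: guess from the first letter, patched with
# small exception lists for words whose first *sound* differs from
# what the spelling suggests. Illustrative only.
CONSONANT_SOUND = ("uni", "use", "eu", "one", "ufo")   # "a university", "a UFO"
VOWEL_SOUND = ("hour", "honest", "honor", "heir")      # silent h: "an hour"

def article(word: str) -> str:
    w = word.lower()
    if w.startswith(VOWEL_SOUND):
        return "an"
    if w.startswith(CONSONANT_SOUND):
        return "a"
    return "an" if w[0] in "aeiou" else "a"

for w in ("apple", "university", "hour", "historic", "UFO"):
    print(article(w), w)   # an apple, a university, an hour, a historic, a UFO
```

The exception lists stand in for the pronunciation knowledge the rule actually requires, which is why no purely orthographic version of the rule can ever be complete.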
Homophone Distinctions

Homophones in English, words that sound identical but differ in spelling and meaning, often lead to misconceptions about their usage, with many assuming rigid grammatical rules dictate distinctions rather than contextual application. A primary example is the pair "affect" and "effect," where "affect" functions predominantly as a verb meaning to influence or alter something, as in "The decision will affect the outcome." In contrast, "effect" is chiefly a noun referring to the result or consequence of an action, exemplified by "The effect of the decision was immediate." Exceptions include "effect" as a verb signifying to bring about or accomplish, such as "to effect real change," and "affect" as a noun in psychological contexts denoting observable emotion, like "flat affect." These nuances underscore that while part-of-speech guidelines provide a foundation, context determines precise application, countering the myth of inflexible rules.[83]

The set "there," "their," and "they're" exemplifies another common pitfall, frequently mishandled due to their phonetic similarity. "There" denotes location or existence, as in "The keys are over there," or serves as a pronoun to introduce ideas, such as "There is no excuse." "Their" is the possessive form of "they," indicating ownership by a group, for instance, "Their opinions vary widely." "They're," a contraction of "they are," applies in statements like "They're arriving soon." Misuse of these is prevalent in digital writing, where rapid composition exacerbates errors; a 2020 study on postsecondary students' editing strategies revealed that homophone confusions, including this trio, persisted without explicit instructional tools, highlighting their persistence in online and text-based communication.[84][85]

Contrary to beliefs in prescriptive "rules," homophone distinctions rely on usage conventions and contextual cues, shaped by English's orthographic evolution rather than grammatical mandates. Many such pairs trace to phonetic mergers in Middle English, when sound shifts like the Great Vowel Shift and vowel consolidations caused formerly distinct pronunciations to converge, creating homophones such as "vain" and "vein" through the late Middle English merger of /ai/ and /ɛi/ into /ɛi/.[86]

To aid recall, mnemonics prove effective: for "affect" versus "effect," the acronym RAVEN (Remember: Affect = Verb, Effect = Noun) reinforces primary roles, while noting "effect" starts with "e" for "end result." For "there/their/they're," associate "their" with "heir" to evoke possession, test "they're" by expanding to "they are," and link "there" to "here" for spatial reference. These orthographic strategies, distinct from grammatical analysis, emphasize spelling awareness in preventing misconceptions.[87][88]

References
- https://www.merriam-webster.com/grammar/words-to-not-begin-sentences-with