Who Says What’s Right? The Real Story Behind “Good Grammar”
- Yassie
- Jul 16, 2025
- 3 min read
Updated: Jul 19, 2025
At its core, grammar is just a set of shared agreements. But whose agreements? And do they still hold? The 63rd episode of Creatinuum, “How Valid Is ‘Good’ Grammar? A Brief Discussion on Prescriptivism vs. Descriptivism,” touches on these conventions.

Who Decides Good Grammar?
Ask ten people to define “good grammar,” and you’ll likely get ten different answers. Some will say it’s about following strict rules. Others might shrug and say it’s whatever sounds natural. The truth is, grammar isn’t just a technical set of laws. It’s a cultural and historical construction. And the debate over what counts as “correct” language often says more about power than it does about clarity.
Prescriptive vs. Descriptive: Two Sides of the Same Coin
The tension at the heart of grammar comes down to two schools of thought:
Prescriptive grammar is rule-bound. It insists on what language should be. Think: no split infinitives, no double negatives, and definitely no “ain’t.”
Descriptive grammar is about how language is actually used. It observes rather than corrects. If people consistently say something in a particular community or context, then that’s how the language lives and breathes.
Neither is inherently better. Each serves a purpose.
Prescriptive grammar is helpful in formal writing where consistency and clarity are critical. But descriptive grammar is what allows language to evolve and adapt across cultures, classes, and generations. It respects regional expressions, cultural dialects, and real-life usage.
The “Rules” Are Newer Than You Think
Many so-called “rules” of English grammar were invented surprisingly recently. Some date back only to the eighteenth century when scholars began publishing grammar handbooks to formalize English for education and elite society. These early grammarians often tried to make English behave more like Latin—even though the two languages function very differently.
That’s why rules like “never split an infinitive” or “don’t end a sentence with a preposition” exist. They weren’t born out of clarity—they were born out of imitation.
And yet, we still carry them today.
When Grammar Policing Becomes Elitism
There’s a reason grammar rules are often weaponized. “Good grammar” has long been associated with education, wealth, and social standing. Historically, being able to read, write, and speak in a “proper” way was a marker of class and often gender.
Correcting someone’s English can therefore carry weight beyond mere clarity. It can feel like a power move. A way of saying, “I know the rules. You don’t.”
But what if the speaker’s English is shaped by a different system altogether?
African American Vernacular English (AAVE), for instance, has consistent grammatical rules that have evolved over time. The same is true for regional varieties such as Asian, Latin American, and Philippine Englishes. If we dismiss these as “wrong,” then we’re promoting exclusion.
Context Matters More Than Perfection
Should grammar matter in formal essays, reports, or official communications? Absolutely. That’s where prescriptive rules shine. They create a common ground for professionalism and clarity.
But should we correct someone’s casual English in conversation? Or shame dialects used in songs, tweets, or texts? Not unless we’re prepared to ignore the richness and flexibility that makes English the global force it is.
Language is not static. It is diverse, messy, and ever-changing.
So…What Makes “Good” Grammar?
The answer depends on context.
If you're writing a grant proposal or a thesis paper, “good grammar” probably means adhering to a style guide like MLA or APA. If you're talking to your friends, “good grammar” might simply mean being understood.
It’s not about ditching all rules. It’s about knowing when to follow them and why they exist in the first place. Because grammar, like language itself, is ultimately about connection. And connection doesn’t always require perfection.