When did you start writing? I can take a guess.
Maybe you are young enough to have found your place in fanfic forums at a formative age. Maybe you were a voracious reader as a child, and upgrading to writing your own stories simply made sense. Maybe you were a shy, studious teen in whom an English teacher took a particular interest, encouraging you to try your hand at a few poems. Maybe your diary was your closest confidant when you were young. Maybe you stumbled into a community of writers on Twitter in your mid-30s and thought, hell, why not? Maybe it was none of these things.
Personally, I never know how to answer that question. I could give you the technical answer, which would be “around kindergarten” like most folks I know. I could tell you when I started coming up with stories, I could tell you when I started committing those stories to paper, I could tell you when those stories became more original content than childish mimicry. I could tell you when I started taking myself “seriously” as a writer; when I started identifying with that word; when I took my first CW101 workshop. But none of those answers feel true.
The truth is, I can remember coming up with narrative scenarios long before I ever learned my ABCs. That’s all writing really is for me—telling stories. I did it with my toys, with my peers, and even by myself in the backseat on long car rides. Telling stories is one of the most fundamentally human behaviors; it comes naturally to us. Written language is nothing more than a tool we invented to help convey those stories—well, that, and as a way to record important information. But if we’re really going down the rabbit hole about it, I could argue it’s all the same practice.
Now: what makes you a writer? That’s a different question entirely.
Some people might say a writer is someone who writes every day. Most writers I know would heartily disagree1. So what do you do when you’re not writing?
For me, it involves a lot of guilt and self-loathing. I don’t recommend this practice, if you can help it. Obviously, no one can be writing all the time. That’s ridiculous. If you were truly writing all of the time, you wouldn’t have any time left over to be inspired. Inspiration comes from being out in the world, giving your brain new input to transform into creative output. And if you aren’t being inspired, your writing will not be very good.
I think, instead, we should define writers as people who write. I imagine that would make nearly everyone reading this a writer. So, congratulations! You’re a writer. You didn’t even have to pay for an MFA.
Writing is not a binary system. It’s not a matter of do or do not, okay? We’re not Jedis2. I know I’m not the first to make this argument, but I do believe it’s a sentiment worth repeating. I certainly don’t feel like much of a writer when I’ve gone seven weeks without opening my manuscript3—nor do I feel like much of a writer when I’m actively writing in said manuscript, because most of what I write the first go-round is an insult to the written word.
And okay, yeah, maybe that’s a little dramatic. But writing is also pretty dramatic, so sue me.
I’m the first to admit that I am not a shy person. I love meeting and befriending other writers, and I’ve been incredibly fortunate in that endeavor. I know and love so many creative people—and I spend an exorbitant amount of that time convincing them that the art they make is worthwhile; that their identity as an artist is valid; that not having all the answers is a part of the process. And for all the advice and affirmations I dish out, I know I need it back tenfold when I’m the one doubting myself.
This has led me to believe that doubt is nothing more than a part of the process. As a kid in Sunday school, I was taught that doubt strengthens faith—well, maybe it strengthens art, too.
I also believe that doubt is a symptom of Capitalism, and the monetization of art will always, inevitably lead to the question of capital-w Worth. We live in the era of hustle culture, and hustle culture is full of Jedis. If I have to see one more “inspirational” grind-worship post from a skinny white woman trying to sell me MLM smoothies on Instagram, I might start biting people.
Honestly, what drives me insane about “do or do not, there is no try” is that it’s a statement of absolutes. There is no try? Are you sure, Master Yoda? I’d argue there’s a vast sea of possibility between do and do not. You could kind of, sort of do something. You could do something else entirely. Innovation dies when binary systems prevail; if you can only be one or the other, there’s no room for anything inventive or progressive.
All that having been said, let’s start smashing some binaries.
What’s a Binary?
I’m so glad you asked! Here, I’ll even do the Googling for you.
The part of the definition I’m most concerned with for the sake of this post is the two parts. Any system that implies something exists only in black or white is a binary system, and therefore a target for this argument.
Sometimes, binaries are useful. They’re what make computers work! I’m a big fan of computers, personally. But they’re also what make facial recognition software4 so bad at identifying people of color—so, I think it’s safe to call them useful but dangerous.
You may be familiar with this philosophy from a gender standpoint; indeed, if we run in any of the same circles, you probably already know that I identify as nonbinary on the gender front. I’m not going to waste my time (or yours) explaining exactly how or why I came to use that identity, but I am going to borrow a statistic I picked up from my undergraduate Gender and Life Writing course.
Many people tend to think of gender in the binary sense: you are either male or female. However, recent studies show that up to 1-2% of human beings are actually intersex5—this is about the same overall percentage as people with red hair. So, if you’re thinking, “1% of people isn’t that many!” I’d encourage you to ask yourself: how many redheads do you know?
Insisting that everyone on earth is either a man or a woman is effectively the same as insisting everyone is either blonde or brunette. Ninety-nine percent of anything is not all of it. There will always be people left out when we talk about gender (or hair color!) as strictly this or that.
So instead of thinking about identity as something that must fit a strict categorization, it’s actually much more accurate to view these designations as points on a spectrum. Today, I hope to convince you that everything exists on a spectrum6—from gender, to life, to publishing, to the Force of the universe itself.
Because this is allegedly a craft newsletter, let’s start with the most pervasive binary of the book world.
At some point in your educational career, most likely in elementary school, you were probably taught that there are two types of books—fiction, and nonfiction. You may have struggled with this designation, because the non prefix made you think of not, as in “not true.” But ultimately, if your teacher or librarian was successful, you would have walked away from this lesson with the idea that books about real people and real things were nonfiction, and books that had made-up stories were fiction, or false.
I think poetry is the real redhead of writing under this dichotomy—I challenge you to go into your local bookstore or library and find where poetry is shelved. Is it next to the novels? The memoirs? The essays?
If you were to browse the poetry section at your local bookstore, I doubt you would find it separated between fiction and nonfiction, but rather all mixed together. That’s because poetry, as a form, is much more comfortable defying the idea that something must be fact or fiction.
At my high school, 11th grade English was dedicated to “American Literature.” One of the first books we were assigned was Tim O’Brien’s The Things They Carried. The synopsis of the book reads, “The Things They Carried depicts the men of Alpha Company: Jimmy Cross, Henry Dobbins, Rat Kiley, Mitchell Sanders, Norman Bowker, Kiowa, and the character Tim O’Brien, who has survived his tour in Vietnam to become a father and writer at the age of forty-three.”
“The character Tim O’Brien” was a point of contention for my classroom. The conversation went something like this:
“But Tim O’Brien is the name of the author,” an astute student pointed out.
“Yes,” our teacher agreed.
“And he really fought in Vietnam.”
“Yes.”
“So the book is true.”
“No. If it were true it would be a memoir. This is a novel.”
“So Tim O’Brien is lying?”
“What? No. He’s just not telling a true story.”
“Then why is the character named after himself?”
“Because it’s based on real experiences.”
“Then why isn’t it a memoir?”
Et cetera, ad nauseam, until the bell finally released our poor teacher from this endless psychological rabbit hole.
Here’s another quote from the back of the book: “A classic work of American literature … [a] meditation on war, memory, imagination, and the redemptive power of storytelling.” Word to the wise: if a publisher7 ever wants to call your book a meditation, odds are it’s giving the marketing department a run for its money.
If you’ve read The Things They Carried, you probably know that whether or not the stories are true is the least important part of that book. They are stories inspired by true things, artfully curated to make a point—that makes them literature instead of autobiography, as far as the Library of Congress is concerned. But the way we remember stories, and the way they inspire us, does not conform to the Dewey Decimal system.
This conversation came up at school recently; a student in my cohort was lamenting how dismissive it can feel when teachers and mentors tell us to “try writing memoir” to unpack our lived experiences and trauma. She shared this tweet:
If only Ace_Librarian7 had been available that fateful day in 11th grade! I think we all would have had an easier time.
Another book for another class that same year of high school was Jon Krakauer’s Into the Wild. As far as I’m aware, that book is still being published and advertised as nonfiction (Amazon hilariously lists it in “Travel” as well as “Biography”) despite being debunked several times over. I mention this title as an example of how the wishy-washy designation of “nonfiction vs fiction” wishy-washes itself both ways—in fact, it’s been wishy-washy since the beginning.
In literary scholarship, there’s a lot of debate over who penned “the first novel in English.” When I was in undergrad, a professor told me it was Daniel Defoe’s Robinson Crusoe—however, a quick Google will show you that the first edition of Robinson Crusoe was not billed as “a novel,” but rather an autobiographical account written by a real person. If you’ve ever had the misfortune of reading Robinson Crusoe, this will explain the weirdly detailed lists and diary entries. Daniel Defoe heard a story about someone who had been shipwrecked and survived, thought, “that’d make a great story!” and proceeded to try and sell it as truth. When it came out that it was all fake, we decided it was fiction all along and added it to all those “100 Books to Read Before You Die” lists. But take it from me—Robinson Crusoe is not worth any time of your one wild and precious life.
A more recent title that beautifully navigates the question of fact vs fiction is Daniel Nayeri’s Everything Sad is Untrue (A True Story). Like Tim O’Brien, Nayeri has written a novel based on his real life experiences—using fiction to strengthen the thesis of the story. The back copy calls it, “a powerfully layered novel that poses the questions: Who owns the truth? Who speaks it? Who believes it?”
Unlike Krakauer and Defoe8, Nayeri is intentionally playing with the dubious nature of fact vs fiction—it’s all right there in the title. It’s proof that we can tell good stories authentically without having to pass them off as entirely made up or entirely true.
As a reader, I find my favorite nonfiction books tend to have some literary elements, especially in terms of narrative—admittedly, that is what made Into the Wild such a good book, before we realized all the problems with it. It’s also what makes books like The Immortal Life of Henrietta Lacks by Rebecca Skloot and The Library Book by Susan Orlean such page-turners, and the lack of it is why Orientalism by Edward Said is taking me three years to read.
But what does all this have to do with your writing?
My aunt told me that when she was in journalism school, she was taught that “good journalism” should be objective; you can’t let your own internal biases control how you report on an issue. On the surface, that is pretty good advice. However, as many modern journalists are quick to point out, “complete objectivity” is subject to human error, and has historically allowed for the erasure of a lot of important context.
Quick & easy example—do you know the phrase “the bystander effect”? Do you know where it came from?
When I was in middle school, we had a “safety seminar” where we learned all about how to be kind, safe, and responsible citizens of the world (allegedly). In this course, they taught us that the bystander effect or bystander apathy is a theory of psychology which states that when one witnesses an emergency, they are more likely to assume someone else has already called for help than to do it themselves. If you see someone get mugged on a crowded street in New York City, you might assume someone else on the street has already called 9-1-1. But if everyone in the crowd makes that assumption, no one calls 9-1-1 at all, and the mugger gets away!
None of that hypothetical was objective, for the record; but it was presented objectively because in the United States, “safety” relies on a binary code of ethics in which good is good and bad is bad, and if you question that you are also bad now, sorry!
But where did this theory come from? Well, it was first recorded in 1964, after the rape and murder of a woman named Kitty Genovese in Queens, New York. Papers reported that 38 bystanders watched passively as Kitty Genovese was repeatedly attacked, screaming for help and only feet away from her own front door. This debunking has three parts:
1. There were not 38 witnesses to this murder, which took place in the middle of the night. The cops collected 38 statements from neighbors—many of whom were asleep at the time Kitty Genovese was attacked.
2. Kitty Genovese was a lesbian, living with her girlfriend, in a neighborhood with a massive immigrant population. I have hang-ups about calling the cops when I hear a strange sound in my neighborhood in Portland in the year 2023. I can’t even imagine how complicated the relationship between citizens and police must have been on Kitty Genovese’s street back then. Is it really such a wonder that no one called the police?
3. Someone did call the cops. In fact, a few people did. The reports were given “low priority” because police assumed it was a domestic dispute—e.g., a husband battering his wife. And again, it’s 1964. You were about as likely to be convicted of spousal abuse as you were to accidentally take your own time-traveling son to the prom9.
If you visit the Wikipedia page for the supposed “bystander effect” today, you’ll find the following addition to the term’s definition (emphasis mine):
Recent research has focused on "real world" events captured on security cameras, and the coherency and robustness of the effect has come under question.[1] More recent studies also show that this effect can generalize to workplace settings, where subordinates often refrain from informing managers regarding ideas, concerns, and opinions.[2][3]
It’s almost like citizens under an authoritarian state are more likely to fear the authority than trust it. Isn’t that interesting?10
I know I’ve strayed pretty far from how this relates to fiction writing, but hear me out. When we think journalism, we think news; when we think news, we think nonfiction (those pesky binaries, once again!). It’s my sincere belief that art ought to exist in context of the rest of the world, and if you’re writing about humanity in any sense—even fictional—you’re engaging, however lightly, in journalism.
There is no denying that the story of Kitty Genovese is a tragedy. But isn’t it interesting how vastly the story changes depending on how it’s told?
I have a couple more false binaries of the book world to unpack with you, but this has already gone on long enough. Thanks for sticking with me ‘til the end, if you did. I’ll be back next week with part 2.
You are good. All my love,
Alex
This is one of those quotes that gets misattributed to everyone from Cicero to Madonna, but I owe the inspiration for this argument to author Martine Leavitt, who spoke about it at length in the 2022 commencement ceremony at my college. Martine is a sage, and I highly recommend checking out her website if you’re on the hunt for writing advice.
Nor do we deal in absolutes, but I’ll get to that later.
Of course, in those seven weeks I also decided to start a newsletter, which I am writing at this very moment! So, maybe there is something to be said about the action of “writing” being essential to our nature.
I held myself back from going on a complete rant on the meaning of “binary” in the computer sense but, basically, if you want to know more, you should really pre-order FUTURE TENSE by Martha Brockenbrough. And if you want to hear my rant, leave a comment or something and I’ll gladly give it to you.
Little ba-dum tsssss for the neurodivergents in the audience.
We also read In Cold Blood by Truman Capote, billed as “one of the first non-fiction novels ever written” on the back matter. This is a paradox, but one that Tim O’Brien could have benefited from. There’s not exactly a “Nonfiction Novels” section at Barnes & Noble though, is there? Put your money where your mouth is, publishing!
I’ll die on this hill. Screw Robinson Crusoe and every professor of literature that has forced their suffering undergrads to read it. It’s racist, orientalist, and dripping with toxic masculinity.
Yes, I do know Marty goes farther back to 1955 in Back to the Future, but it’s my newsletter and I’ll make all the references I want!
Oh snap here I go again, plugging my favorite podcast. But seriously, You’re Wrong About has an amazing Kitty Genovese episode, if you want to hear about it from a real-life journalist.