Though using a pattern to inform our books’ structure has merit, it may lead us to a troublesome end.
There are multiple guides we can follow to properly structure the books we write. Perhaps the most common is the three-act structure, but there are many others as well.
There are enough to make me dizzy, so I won’t try to list them. Besides, this post isn’t meant to promote these various models so much as to share my concern about them.
For example, I know that when watching a movie, I should expect a plot twist about three-fourths of the way through. The incident may be trivial, may have been telegraphed too heavily earlier in the movie, or may come as an unexpected shock, but one thing is certain: I know something is about to happen, so I brace for it.
Because I expect this plot twist to pop up, it seldom delights me. I know this annoyance is just one more hurdle for the protagonist to clear before I can enjoy the ending. And I’d better enjoy the ending.
This happens in books too, but because I’ve watched more movies than I’ve read books, I’m more attuned to it in movies.
While I think it’s important that we know about these writing devices and can apply them when needed, I worry about slavishly following them.
Why is that?
Computers and artificial intelligence.
Even now, computers can write. And it won’t be long before they write passable stories and even books. Just enter a couple of characters, a story arc, a conflict, and a few other key parameters. Press Enter, and a finished story emerges, following an established writing model.
This technology will one day make most writers obsolete. And I think it will happen much sooner than most people expect.
What computers and AI software will have trouble emulating, however, are the truly creative writers who don’t follow the models the programs are built on. These writers (and I plan to be one of them) will still be in demand, because computers will struggle to produce a truly creative book that transcends its writing-model programming.