Forget the Opening, It’s All About the Ending – AI and the Art of Story

What do you remember most about an experience?

Is it the beginning, the middle, or the end?

Taken at face value, we might be tempted to say the beginning – those first few moments of encountering something new.

Or if you’re one of those folks who likes to focus on the journey rather than the destination, you might say the middle.

Or if you think the end matters more than the means then, well, the end.

All of these can be true, but when it comes to certain types of experiences, it seems that the middle – or the “peak” – as well as the end are the two that can leave the most lasting impression on us.

Let’s look at an example.

Say you went to a restaurant and want to talk about the experience with your friends afterwards. What parts of the experience do you usually focus on?

Was it the way the food looked in the menu, the way it tasted in that first bite, or the way it left you feeling when you were done?

If I had to guess, you would probably spend most of your time talking with your friends about how the food tasted – the peak – and whether by the end of the evening you would go back or not.

Granted, you might talk a lot about that first impression if the rest of the experience let you down. But that might just underline the importance of the middle and the end.

The importance of peaks and endings in experiences is something others have noticed, and not just about food.

To give another angle: according to data compiled in the typically insightful newsletter Ariyh, a satisfied customer is one who felt a strong connection and a positive emotional response at the peak and ending of their journey.

So, when it comes to products and services, customers don’t tend to judge the experience based on their first impressions.

At least, not when it was a positive experience.

This got me thinking about how the same observation often holds true in storytelling.

Think about it.

How common is it for a story to start off slow, but then hit you with an unexpected meaningful peak or maybe a twist, and then a bang of an ending?

More importantly – how often are those the stories you are more likely to remember? And share? And talk about after?

As a note, the flip side is true too. It doesn’t matter how strong your opening is – if your story goes nowhere and ends on a whimper, it’s going to be forgotten.

The importance of peaks and endings cannot be overstated in literature.

Good books leave you feeling something.

Take The Great Gatsby.

Those last few lines are killer, but can anyone remember the opening lines of the book?

Or what about the feeling that Call Me By Your Name leaves you with? Not a bad ending but can you honestly say the first half of that book was anywhere near as memorable as the second half?

A lot of detective fiction works on this principle too (particularly Nordic Noirs, which I love). In crime fiction, the peak often happens when the identity of the killer is revealed. It gives you that rush before the (typically) cathartic ending.

Or perhaps if you’re the type who enjoys reading cozy locked room mysteries, the peak also happens at the ending.

This is also the case with a lot of movies – the ending is what sticks with you after you leave the theatre and recommend it to friends (or not) – not those first few minutes followed by ninety minutes of stink.

The Usual Suspects, The Mist, Se7en, Planet of the Apes, Twelve Monkeys, and countless others take this to heart (and I honestly can’t recall the first scene of most of these).

As a side note: arguably all of Shyamalan’s films have tried to work this formula with their “twist” endings, for better or worse.

Of course, this is not a perfect rule by any means.

The Gunslinger by Stephen King breaks the rule. That book is celebrated for having one of the strongest opening lines in history, but I honestly couldn’t tell you how it ended.

Nevertheless, for writers and storytellers, there is certainly something to be noted here.

While the first lines of a story should pull the reader in (or the first scene, draw in the viewer), if the peak and ending don’t resonate, it won’t matter how much time you spent crafting that opening.

I wish I could say that I was an ardent follower of this philosophy in my own work. In my own writing, I often spend way too much time getting caught up in beginnings and overlook how to cap off a story.

(Sigh. Pointers for the future)

But wait, what does this have to do with AI?

We’re getting close to a turning point or perhaps crossroads with AI and storytelling.

AI has already figured into discussions about storytelling – most notably as one of the sticking points in the recent WGA strike.

Small to medium publishers have also taken steps to address the use of AI in storytelling – many of them adding notices to their submission guidelines informing submitters that any work found to have been produced in whole or in part by AI will be rejected.

By and large, storytellers do not want AI encroaching on their space nor their trade.

Fortunately, and I can say this with some confidence, in its current state, AI cannot replace storytellers.

But that doesn’t mean it won’t be able to in the future.

As Large Language Models (LLMs) devour more and more fiction, film scripts, and anything else they can get their crawlers on to effectively “learn” how a story is composed, there’s a chance that in the near future the next bestseller will be composed entirely by an AI.

Only, I suspect it will be more of a curiosity at first than a genuine replacement.

If we consider the discussion we’ve had so far about the importance of “peaks” and “endings” in creating emotional attachment, resonance, and memorability, AI is at a serious disadvantage.

Sure, an AI can likely come up with a complete twist ending that comes out of left field and surprises the hell out of us.


But because AI doesn’t understand emotional resonance, it will likely never deliver a twist or story peak that hits with the same power as one crafted from actual human experience.

At best, I suspect that the peaks, twists, and endings an AI produces might come out feeling ho-hum, random, or largely nonsensical.

The simple fact is, crafting a story is one thing, but crafting a good story is seriously hard work.

Anyone can write an unmemorable story. Hell, half of our days are likely filled with unmemorable moments.

But to piece together the setup, the peak, and the ending that stays with us – the ones that master storytellers who fundamentally understand the human condition and its weaknesses produce – that’s an altogether different matter.

At this point, I haven’t seen anything that suggests an AI can give us a journey like the best ones out there. In that regard, writers – human writers – have an ace up their sleeves.

However, the bigger problem is that we might get to a point where no one cares.

We might get to a point where the bare-minimum, AI-regurgitated story is good enough – or worse, becomes the norm.

I feel that’s the bigger danger of unchecked use of AI – the real possibility of low-quality, shovelware-depth stories being more profitable for the people distributing them than human-crafted works.

Whether it’s movie producers willfully churning out crap to save a few bucks on paying a screenwriter, or publishers flooding the market with a new breed of AI-produced dime novels, one should never underestimate man’s (or woman’s) desire to turn a quick buck.

If people allow themselves to be willing participants and consumers of AI stories – beyond the initial novelty – that’s where the real danger to storytelling lurks. When we simply no longer care.


Posting Music, Mike Ensign, and Please Don’t Send Me Home On a Stretcher

Since July, I’ve been periodically dipping into an old folder on my hard drive that contains electronica tracks I put together back in my early 20s (circa 2008-10).

I’d more or less forgotten about most of them, as it was around that time that I was starting grad school and the time I could devote to my more creative hobbies started to dwindle.

(and if I recall correctly, it was also when I was suffering from horrible insomnia and carpal tunnel pain, which caused me to quit making music on the guitar as well).

In any event, I’ve been steadily uploading these more or less “lost” tracks under the moniker Mike Ensign since then.

One of the tracks I published just the other day, for reasons known perhaps only to the deities running and managing the “suggested videos” algorithm, more or less exploded, gaining over 1,000 views in 24 hours.

Considering most of my other uploads have earned roughly 1-2 views (or a big 50-60), breaking 1K feels like a lovely little milestone.

To celebrate, I decided to do some shameless self-promotion.

Here’s a link to it on YouTube:

Here’s a listing I also quickly put together on Bandcamp, where it can be purchased for $0.99 or whatever people want:

Perhaps this was just one anomaly, or maybe I hit a good stride in optimizing my videos to get some views on them. We’ll see!


Robo Journalists: Will AI Replace Journalism As We Know It?

Imagine you’re on a website, reading the news about the recent passing of a major figure in the tech industry.

You discover details about when they passed, the names of some companies they were attached to, and which famous people shared their heartfelt condolences.

It’s only when you reach the end of the article that you realize the author never mentioned their name.

Stories like this have started appearing on sites across the internet. At first glance, they seem like normal news pieces, but when you look closer, you realize something is off with the details.

The reason, of course, is that it was written by an AI.

Do Journalists Use AI?

AI has been participating in the newsroom for longer than we might expect.

According to What’s New In Publishing, newspapers such as the Washington Post have been using generative AI for the past several years to write hundreds of articles.

To be clear: rather than write whole articles, in-house bots have been condensing articles, creating short alerts of high impact news, and even composing the equivalent of Tweets about sports games.

In these cases, their use has largely been like that of a copy editor, helping churn out micro-content at a rate faster than most people could. Most of us have likely never noticed because the use was subtle and, if I may say so, appropriate.

After all, AI can be a powerful tool in the hands of the right person.

Due to its limitations at the present moment, there is a constant need for a human to either supervise the delivered product or set its operational parameters in a way that constrains it appropriately without letting it go off the rails.

Where things get tricky is when AI is used to write articles wholesale – either from a prompt and headline provided by the editorial team, or to generate everything (topic, title, content) from scratch.

For anyone who has played around with ChatGPT, we all know what can happen if we give an AI too much room to be “creative” – we end up with what’s often called a hallucination.

In sum, we can easily get bonkers results.

These mishaps can range from misunderstanding the prompt and producing something irrelevant to making things up – such as imaginary guitar chords.

Of course, terming these missteps hallucinations is a coy euphemism for what’s really going on: the large language model simply making something up because it sounds, well, like something that a person might write.

Will AIs Be Responsible for Fake News?

At the moment, I’m hesitant to call the hallucinations and made-up nonsense produced by AIs fake news.

Fake news is commonly used to describe what happens when “news” sites either wholesale make up stories or write them in a way that the takeaway is entirely misleading.

There’s an intentionality in fake news – whether it’s for creating divisions ahead of a political campaign, driving up support among tribal lines, or simply for exploitatively generating clicks and more ad revenue based on sensational headlines.

The AIs used for generating text do not have intentionality, so it is not as if these tools are responsible for everything they produce.

However, their operators certainly might be inclined to produce fake news, and in their hands these tools have the potential to supercharge their output.

A lot of the fake news out there is produced by the equivalent of sweatshops, where a team of numerous writers in a country with loose labour laws and low costs sits around and hammers out as much content as they can – effectively throwing all kinds of nonsense at the web and seeing what sticks.

Whereas the worst-case examples were essentially limited by how fast a person can type, a bot such as ChatGPT could reduce the time to produce a fake article from, say, 30 minutes to 30 seconds.

There are some limitations on most large language model tools that mean a human editor has to go in and shape the output afterwards. But there is already an “unethical ChatGPT” available on the dark web that can likely be used to produce more harmful content faster than before.

Will AI Replace Journalists?

The truth of it is that AI is already being used to write news articles and replace journalists.

If it seems like there’s more and more AI-written content out there, you’re not crazy – and the rate at which this content is published is likely accelerating.

The Guardian recently identified 50 news websites that appear to be composed of entirely (or close enough) AI generated content. Many of the articles are top 10 listicles, but others also contain advice columns and “news articles” that may not be grounded in actual events.

While most of the sites on that list look like quick traffic grabs or the kind of mid-to-low quality content a typical affiliate site offers, more reputable publishers are also posting articles written by machine learning tools.

The Daily Mirror and Express have both begun publishing articles written entirely by AI, despite claims that neither intends to replace its human writers. German newspaper Bild, however, has begun laying off staff as it transitions to more AI.

Technology providers are also doing little to stop the trend.

Google has started testing its proprietary tool Bard’s ability to generate news articles without human input, while Microsoft is offering cash to outlets to begin implementing ChatGPT and other OpenAI tools in their newsrooms.

We’re not at the point where a bot can replace investigative journalism or original on-the-scene reporting, but the other duties that make up a journalist’s day – such as picking up stories from a wire or commenting on last night’s sports game – are already going the way of the dinosaur.