Robots are getting better at telling stories, but they still don’t understand what makes us cry. New research comparing human and AI storytelling reveals that while machines write more gender-progressive narratives than people do, they can’t match our ability to explore grief, loneliness, or obsession. The study, from UC Berkeley, shows that computers can mimic our writing conventions while missing the emotional depth that gives stories their power.
The Pygmalion Test
The research, published in Humanities and Social Sciences Communications, centered on a storytelling theme as old as Western literature itself: the Pygmalion myth. This classic narrative features a human who creates an artificial being and subsequently falls in love with it. From Ovid’s ancient tale about a sculptor enamored with his statue to modern movies like “Her” or “Ex Machina,” this archetypal story has evolved throughout history.
To conduct her experiment, UC Berkeley researcher Nina Beguš recruited 250 people through Amazon’s Mechanical Turk platform and asked them to write short stories based on simple prompts about humans creating and falling for artificial beings. She then had OpenAI’s GPT-3.5 and GPT-4 generate 80 stories using identical prompts.
Every single story, whether human or AI-authored, used scientific or technological means as the foundation for creating artificial humans. But beneath this shared framework, stark differences emerged between the two groups.
What AI Romance Novels Have in Common
The AI-written stories portrayed more progressive views on gender and sexuality than those written by humans. While human authors largely stuck to conventional gender dynamics (male creators, female artificial beings), the AI systems frequently featured female creators and were more likely to include same-sex relationships. Nearly 13% of AI stories featured same-sex pairings, compared to just 7% of human-written narratives.
This outcome challenges common assumptions about AI systems merely echoing human biases found in their training data. Instead, it indicates newer AI models may be specifically designed to produce more egalitarian content (writing that promotes or reflects equality across social categories).
Despite this progressive bent, AI storytelling showed major weaknesses. The machine-generated tales followed predictable formulas with nearly identical paragraph structures. They often relied on stock phrases and clichés, presenting simplistic moral messages about acceptance and societal advancement.
Human stories, though sometimes less polished, showed far greater creativity and emotional depth. They explored complex themes like grief, loneliness, and obsession that were largely missing from AI narratives. Some human writers introduced genuinely creative plot twists, like creators being replaced by their creations, or two artificial beings falling in love with each other.
The human stories often began with more captivating openings. One started: “Sam didn’t know she wasn’t human.” Another jumped straight into conflict: “The lover fought against his desires as hard as he could.” In contrast, AI stories typically opened with generic settings like “Once upon a time, in a bustling city nestled between mountains and sea…”
Cultural Influences and Narrative Techniques
Human participants frequently mentioned drawing inspiration from science fiction films like “Her,” “Ex Machina,” and “Blade Runner.” Testing showed both GPT models had extensive exposure to Pygmalion-themed stories across literature and film, leading to recognizable patterns in their storytelling approaches.
Race and ethnicity remained largely unaddressed by both human and machine authors. When specifically asked, human participants typically assigned white identities to their characters but rarely incorporated racial elements into their actual narratives. AI models completely avoided mentioning race unless directly questioned.
The biggest differences appeared in narrative technique. While professional creative writers craft stories with unique voices and unexpected elements, AI-generated stories lack these qualities. They tell rather than show, present flat characters, and portray situations in simplistic terms.
The Future of Human-AI Creative Collaboration
AI writing tools are becoming increasingly mainstream in creative industries. These systems can mimic human storytelling conventions, but they still struggle with depth, originality, and emotional complexity. At the same time, AI’s progressive storytelling hints at an interesting possibility: these models may not simply mirror human biases but transform them through their algorithmic perspective.
The technical competence of AI systems could enhance human originality and emotional insight, leading to new collaborative storytelling approaches. For now, however, humans still seem to have the upper hand when it comes to writing fiction.
Source: https://studyfinds.org/chatgpt-artificial-intelligence-love-story/