AI, narrative journalism, and the future of the human race


"The development of full artificial intelligence could spell the end of the human race," says Stephen Hawking. (Desiree Martin/Getty)
In a recent Bleader post of mine touting a new travel magazine published by the Smithsonian, I passed along a confession from its editor, Victoria Pope. At an earlier point in her career she'd reported from Germany for the Wall Street Journal, and, she admitted, "I was a disaster at earnings reports, and though I tried I never got any good at it."

There was nothing startling about this admission. Few journalists have a head for numbers, and no journalist I've ever known went into the business to write earnings reports. That's why it's impossible to feel very alarmed by news that in the future no journalist might have to write them. A cover story on AI—artificial intelligence—in the May 9 Economist mentions that a Chicago firm, Narrative Science, not only "hopes to automate the writing of reports" but is already being used by Forbes magazine "to cover basic financial stories."

The Economist takes a measured view of the rise of AI, not fearing the worst but acknowledging that others do. For instance, Elon Musk has rated "the creation of a rival to human intelligence as probably the biggest threat facing the world." In the words of Stephen Hawking, "The development of full artificial intelligence could spell the end of the human race."

I advise ambivalence. On the one hand, what do we have to fear from any intelligence, real or synthetic, that can get suckered into writing earnings reports? On the other hand, what if AI, as it grows up, has a long memory and neither forgets nor forgives?

I'd never heard of Narrative Science until a colleague pointed out the Economist coverage to me, but it's a formidable company. Cofounder Kris Hammond, its chief technical officer, teaches computer science and journalism at Northwestern and earlier founded the Artificial Intelligence Laboratory at the University of Chicago. We're swimming in data, but people don't like numbers, he explained to the Atlantic in 2012. Unless those numbers can be turned into narratives, he said, they do us little good. People need stories.

Hammond may be wrong about that. People might need narrative less than he thinks, and here's an old story of mine making this argument. But let's assume he's right. At Narrative Science it's full speed ahead. Its website boasts: "Powered by Artificial Intelligence, Quill is our advanced natural language generation (NLG) platform for the enterprise that goes beyond reporting the numbers—it creates perfectly written narratives to convey meaning for any intended audience." The stories it writes for Forbes are bylined, "By Narrative Science," and an adjacent note to readers advises, "Opinions expressed by Forbes Contributors are their own." Consider that a word of warning. If the Narrative Science algorithm generates opinion as well as narrative, one opinion Quill must simply have decided to keep to itself for the time being is smoldering resentment at never getting assigned a cover story.

But here's a silver lining! Narrative Science boasts that Quill will communicate data in "conversational language." If Quill can be programmed to be chatty, it can also be programmed to be grammatical. Its algorithm could be tweaked to eliminate the most frequently misspelled three-letter word in the English language—led. And it could enforce the grammatical principle that one of those who should be followed by a plural noun, a rule beyond comprehension by the Tribune for reasons that one of its editors once identified to me as a "blind spot" caused by "human failing."
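Rules like these are easy to mechanize. A minimal sketch in Python of what such copyediting checks might look like (purely illustrative; nothing here is Narrative Science's actual Quill code):

```python
import re

# Toy versions of the two rules the column mentions: "lead" misused for the
# past tense "led", and a singular verb after "one of those who".
RULES = [
    # Past-tense contexts ("was lead by", "has lead the team") want "led".
    (re.compile(r"\b(was|were|has|have|had)\s+lead\b", re.IGNORECASE),
     'use "led", not "lead", for the past tense'),
    # The relative clause after "one of those who" takes a plural verb.
    (re.compile(r"\bone of those who\s+(is|was|does|has)\b", re.IGNORECASE),
     '"one of those who" should be followed by a plural verb'),
]

def flag_errors(text):
    """Return a list of (matched phrase, explanation) pairs."""
    return [(m.group(0), msg) for rx, msg in RULES for m in rx.finditer(text)]

print(flag_errors("He was lead by one of those who was there."))
```

Real grammar checking needs far more than regular expressions, of course, but the point stands: once a rule can be stated, a machine never has a "blind spot" about it.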

AI takes human failing out of the picture.

Do these small advantages make the "end of the human race" any easier to accept? You might think not. But articles on AI dependably offer reassurance that the end is not yet nigh. The articles mentioned above are no exceptions. The Economist offered this observation:

But even if the prospect of what Mr Hawking calls "full" AI is still distant, it is prudent for societies to plan for how to cope. That is easier than it seems, not least because humans have been creating autonomous entities with superhuman capacities and unaligned interests for some time. Government bureaucracies, markets and armies: all can do things which unaided, unorganised humans cannot. All need autonomy to function, all can take on life of their own and all can do great harm if not set up in a just manner and governed by laws and regulations.

This ingenious analogy is comforting largely because it's clever—and cleverness is an attribute that doesn't come up much when AI's talents are being touted. Cleverness might turn out to be the human race's ace in the hole. But now consider the Atlantic story.

Its focus is Narrative Science and the threat it poses to human journalists. Less than you might think, says reporter Joe Fassler, because it specializes in stories human journalists don't want to write or would never think to cover—such as a Little League baseball game no one cares about but the kids and their parents. "The iPhone app Gamechanger, which coaches and parents use to score Little League games, has a 'recap' service enabled by Narrative Science," Fassler tells us. "Mark the final out and, kapow, you've got a print-ready article about the game. In theory, you could even receive recaps with a personal touch, nine innings retooled around the feats and foibles of your little tyke. . . "

Little tyke? Fassler isn't aware of how much stronger he just made the case for AI. No algorithm worth its salt would have let Fassler's redundancy slip by.

Then again, was little tyke a redundancy or was it a cunning "conversational" slip-up inserted by an advanced intelligence to disarm the reader? Consider the note on which Fassler's story concludes:

Besides, the best journalism is always about people in the end—remarkable individuals and their ideas and ideals, our ongoing, ever-changing human experience. In this, [Narrative Science CEO Stuart Frankel] agrees.

"If a story can be written by a machine from data, it's going to be. It's really just a matter of time at this point," he said. "But there are so many stories to be told that are not data-driven. That's what journalists should focus on, right?"

And we will, we'll have to, because even our simplest moments are awash in data that machines will never quantify—the way it feels to take a breath, a step, the way the sun cuts through the trees. How, then, could any machine begin to understand the ways we love and hunger and hurt? The net contributions of science and art, history and philosophy, can't parse the full complexity of a human instant, let alone a life. For as long as this is true, we'll still have a role in writing.

We are humans, magnificent and inimitable. We love, we long, we bleed. No machine can begin to comprehend us. This is such blatant boilerplate it could have been written by a computer and maybe was. AI has advanced to the point where it can churn out the articles that introduce us to AI, each of these articles ending on a mawkish note that puts our minds at ease.

So I sympathize with Stephen Hawking's fear of what he sees coming down the road. Of course, Hawking's judgment is skewed. He never had to write an earnings report.
