
Why AI can’t take over creative writing

theconversation.com – Wednesday April 2, 2025

In 1948, the founder of information theory, Claude Shannon, proposed modelling language in terms of the probability of the next word in a sentence given the previous words. These types of probabilistic language models were largely derided, most famously by linguist Noam Chomsky: “The notion of ‘probability of a sentence’ is an entirely useless one.”

In 2022, 74 years after Shannon’s proposal, ChatGPT appeared and caught the public’s attention, with some even suggesting it was a gateway to super-human intelligence. The gap between Shannon’s proposal and ChatGPT was so long because the amounts of data and computing time required were unimaginable even a few years before.

ChatGPT is a large language model (LLM) trained on a huge corpus of text from the internet. It predicts the probability of the next word given the context: a prompt and the previously generated words.

ChatGPT uses this model to generate language by choosing the next word according to the probabilistic prediction. Think of drawing words from a hat, where the words predicted to have a higher probability have more copies in the hat. In this way, ChatGPT produces text that seems intelligent.
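The "words in a hat" idea can be sketched in a few lines of Python. This is only an illustration of weighted sampling, not how ChatGPT is actually implemented; the word probabilities below are made up for the example.

```python
import random

# Hypothetical next-word probabilities for some context
# (a real model would assign probabilities over its whole vocabulary).
next_word_probs = {
    "cat": 0.5,
    "dog": 0.3,
    "hat": 0.2,
}

def draw_from_hat(probs):
    """Draw one word, with higher-probability words more likely to be chosen.

    Equivalent to putting proportionally more copies of each word
    in a hat and pulling one out at random.
    """
    words = list(probs)
    weights = [probs[w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]

print(draw_from_hat(next_word_probs))  # one of "cat", "dog", "hat"
```

Generating a passage then amounts to repeating this draw, appending each chosen word to the context, and predicting fresh probabilities for the next one.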

There is a lot of controversy about how these tools can help or hinder learning and practising creative writing. As a professor of computer science who has authored hundreds of works on artificial intelligence (AI), including AI textbooks that cover the social impact of large language models, I think understanding how the models work can help writers and educators consider the limitations and potential uses of AI for what might be called “creative” writing.

