Recently, my publisher told me that a large technology company involved in the development of artificial intelligence was interested in using my book, “Stories from Montana’s Enduring Frontier,” “for AI training purposes.”
I would earn, the representative explained, $340 for “this one use.” Single use, like a wet wipe – disposable, consumable, easily discarded?
“Stories from Montana’s Enduring Frontier” brought together 20 years of my essays to argue that 20th-century Montanans developed unique visions of how nature works, as captured in the wilderness adventure and resource extraction connotations of “the frontier.” The book seemed about as far removed from the world of AI as anything could be.
Every writer I know feels particularly vulnerable to AI. Most of today’s commercial AI programs are “large language models,” skilled not in logic, reasoning or mathematics, but simply in text generation. That skill directly threatens the jobs of writers.
Worse yet, replacing a human writer with today’s generative AI is like replacing a wild raspberry with an artificially flavored Crystal Light. Error-filled and uncreative AI products threaten not only writers, but also the joy and purpose of reading.
While most people rant abstractly about AI, this question about buying my book presented a clear choice, sharpened by the specificity of “$340.” If I accepted the offer, would the knowledge I put into these essays become available to an AI, potentially decimating my book sales? If my royalties went to zero because I had signed a death warrant for a book that is like a child to me, would the $340 be worth it?
Maybe $340 was better than nothing. Many tech companies train AI models by stealing from authors. “Stories from Montana’s Enduring Frontier” was one of four of my books pirated for the “LibGen” database, which was used to train Anthropic’s and Meta’s AI programs. Although Anthropic recently settled a resulting lawsuit, Meta and others may yet escape any punishment.
What is the fair value of my book? While $340 isn’t much compensation for all the work I’ve done, neither is a royalty of $1.19 per book sold. If my main goal was adequate market remuneration for my writing, I probably shouldn’t have published a book in the first place. The book is now 12 years old. At current sales rates, it would take a few years to earn $340 in royalties.
When I talked about this dilemma with friends, I felt like none of us knew how to think through the situation. Perhaps, as with previous technologies, making the book more widely available will boost sales – or perhaps not. Perhaps AI will hinder the ability of young people to pursue intellectual careers – or perhaps its dangers are exaggerated.
Maybe AI will swallow all my work without fair compensation – we know Anthropic and Meta have already tried.
My publisher wouldn’t say which AI company made the offer, how they arrived at this take-it-or-leave-it price, or how they would use my book. Would I react differently if AI contributed to knowledge of the world rather than just helping students cheat?
As I thought about it, I realized that I was projecting a distinctly human desire onto the machine. A large language model consumes a book as data. The model always requires more data to predict what the next word in a sentence should be.
It’s certainly deflating to my ego to consider the product of my research, extensive reading, interviewing, thinking, and finally writing as “data.” I would prefer it to be “knowledge” or even “wisdom” that the AI wants to suck out of me. I would prefer to think that it needs my well-told stories, my insightful ideas, my brilliant overarching arguments.
But AI doesn’t think in such general terms. It simply predicts one word, then another word, then another. I realized this was also a model of how nature works. There is no grand project. No knowledge. No story with a satisfying ending. There is only a cell that reproduces. A leaf seeking sunlight. A predator looking for its next dinner.