atrettel
Generative AI is just the new "bottom" in terms of quality. All you have to do to compete against it is be a little better than it. The real question to me is whether the quality of this new "bottom" is adequate. For some people and some applications it is; for others it is not.

I do not use it myself because I am a researcher, and I often ask questions that don't have much "training data" yet. And even when an area is well covered in terms of "training data", there is often a lot of know-how that isn't written down in any easily digestible form; it is passed along verbally or through examples in person. So the idea that the "training data" is complete is not true in general, either.

Many other people in this thread have already covered that books are much more structured and organized than any answer generative AI gives you. Let me discuss another reason why books still matter: books can give you a wider view than the "consensus" that something like ChatGPT gives you. I know many books in my field that derive results in different ways, and I often find value in these different approaches. Moreover, suppose that only one book answers the question you have while the others gloss over that subject. Generative AI will likely not know precisely what one random book said on the subject, but if you were searching through multiple books yourself, you would likely pick up on this difference.

Relevant Paul Graham quote [1]:

> We can't all use AI. Someone has to generate the training data.

[1] https://x.com/paulg/status/1635672262903750662

codingdave
LLMs do not generate new content; they just shuffle old content together in new ways. So no, they do not kill an industry of people creating new original content. And authors only need to worry about them if they were not adding anything new to the world to begin with, and were instead relying on marketing to sell repackaged existing content.

al_borland
Books exist for people who want in-depth information, with full context, in an organized manner.

Short forms have always been available, be it blog posts, Wikipedia articles, Cliff’s Notes, or other such things. Books survive because source material is needed to generate all of those other things, and those short-form versions don’t cut it for everyone. I don’t see LLMs as any different.

A book can tell you something you didn’t know. With an LLM you need to know enough to ask.

galfarragem
I would rephrase it as: AI is shrinking the market for average human "creatives". Unless you are an outlier, adapt or perish.

latexr
Authors compete by being competent, doing research, and outputting factual information. Or just, you know, being original. In a world where LLMs can’t even differentiate between a recipe and an old Reddit joke and tell you to put glue on pizza, it is absurd to think they “killed the book industry”.

What’s with this bloody obsession with killing other products and industries? Every time someone farts in tech, everyone starts shouting that it just killed something else. Calm down. Relax a little and get some perspective. You’re drowning yourself in the Kool-Aid.

LLMs did not kill the book industry, just like Bitcoin did not kill the world’s financial system.

Mehticulous
The smell of paper, ink and binding glue.

The feel of quality paper.

The way the spine cracks when you first open a book.

The way the spine creases after you've read a book a few times.

nonrandomstring
I think you might be confusing different activities that look similar. Search, research, exploratory reading, browsing, fact-checking, cross-referencing, debunking, genealogy, and making etymological and epistemological connections are all different things. As an author and researcher, I produce and consume far more kinds of connections and paths than a simple neural net making fast associations over past training material can offer. YMMV.