Entropy and Language
So I’ve been reading this book on theoretical physics recently—well, these books, really—Brian Greene’s The Elegant Universe[1], which everyone can probably tell from the topics of my columns so far. I’ve also been reading a lot of experimental fiction: Borges, Kafka and David Markson (e.g. Wittgenstein’s Mistress), which is probably a good thing, since experimental literature is what this column primarily focuses on.
I keep coming up with all of these interesting (to me, at least) correlations between physics and language.
A concept that comes up a lot in physics is entropy. A simple way to describe it: entropy measures how much uncertainty, or randomness, there is in a given outcome. Entropy is also present in language. Stand by for an explanation.
Gravity also comes up a lot, as does time, and the two seem irrevocably meshed. For example, if an object has a lot of mass and you are close to it, time actually passes more slowly for you—everything is packed and compacted into a given space, neatly and tightly. Think about the Russian cosmonaut Sergei K. Krikalev, who has spent more time[2] in outer space than anyone else: he is actually, by a fraction of a second, younger than he would be if he had spent that same stretch of time only on the earth, mostly because of how fast he was moving while in orbit.[3] Conversely, the farther you get from anything massive, the faster time passes; everything is more spread out.
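(If you want to see roughly how small that fraction of a second is, here’s a quick back-of-the-envelope sketch in Python. The altitude and orbital speed are my own ballpark assumptions, roughly ISS-like numbers rather than anything from the column or from Greene, plugged into the standard first-order formulas for the speed and gravity effects.)

```python
import math

C = 299_792_458.0        # speed of light, m/s
GM_EARTH = 3.986004e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6        # mean Earth radius, m
ALT = 4.0e5              # assumed orbital altitude (~400 km, roughly ISS-like; my assumption)

# Circular orbital speed at that altitude, about 7.7 km/s
v_orbit = math.sqrt(GM_EARTH / (R_EARTH + ALT))

# Fractional clock-rate differences (first-order approximations):
speed_effect = v_orbit**2 / (2 * C**2)   # moving fast makes the orbiting clock run slow
gravity_effect = GM_EARTH * (1 / R_EARTH - 1 / (R_EARTH + ALT)) / C**2  # weaker gravity up there makes it run fast

net_slowdown = speed_effect - gravity_effect  # speed wins in low orbit, so the net effect is a slowdown

# 803 days, 9 hours, 39 minutes (footnote 2), in seconds
mission_seconds = ((803 * 24 + 9) * 60 + 39) * 60

print(f"net fractional slowdown: {net_slowdown:.2e}")
print(f"seconds 'younger' after the mission: {net_slowdown * mission_seconds:.3f}")
# comes out to roughly 0.02 seconds, i.e. the "fraction of a second" above
```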
Enter: entropy.
Ultimately, words share a lot in common with physical matter and, by that token, with entropy. Think of all of the words thus far in my column—they are compact and neatly organized: not very much entropy, not much room for randomness. Think of the letters as atoms and the words as molecules. The way you have read them until now has been easy: less entropy.
However, w h e n l e t t e r s a r e s p r e a d o u t (like that) it gets a little harder to read; it takes more time, more energy. There is more entropy. When they are spread out even more — l i k e t h i s — it gets even more complicated, ad infinitum.
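For anyone who wants an actual number attached to all this: information theory quantifies exactly this kind of randomness as Shannon entropy, basically a weighted average of how surprising each symbol is. Here’s a quick Python sketch of the idea (character-level entropy only, on a sample sentence of my own; it’s a crude proxy for the reading effort I’m describing, not a formula out of Greene’s books):

```python
import math
from collections import Counter

def char_entropy(text: str) -> float:
    """Shannon entropy of the text's character frequencies, in bits per character."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

compact = "entropy measures uncertainty"   # made-up sample sentence, not a quote from anything
spread = " ".join(compact)                 # same letters, s p r e a d  o u t with extra spaces

for label, text in [("compact", compact), ("spread", spread)]:
    bits = char_entropy(text)
    print(f"{label}: {bits:.2f} bits/char, about {bits * len(text):.0f} bits total over {len(text)} characters")

# For these two strings, the spread-out version actually scores a bit lower per
# character (all those added spaces dominate the distribution), but the total
# number of bits you have to wade through goes up, which is a rough, toy-model
# echo of the reading-effort point above.
```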
If we go back to what I wrote about Borges’ Library of Babel (http://www.spectermagazine.com/tackling-borges), some of the volumes therein would consist of only one letter and a shitload of white space. It would be nearly impossible to read a novel one letter, one volume at a time. Read that way, there would be a shitload of entropy: it’d be incredibly easy to get confused, the randomness would be harder to account for, and you’d likely need more than a single human lifetime to get through War and Peace.
It’s even easy to consider genres in terms of entropy: poetry (usually) has more entropy than novels do.[4] Poets (again, usually) use more line breaks and white space than novelists, less concrete language, more metaphor and symbolism. Poetry typically allows more room for randomness and chaos.
But as I mentioned, not always.
Take Darby Larson’s The Iguana Complex, for instance. I would consider this work highly entropic and its genre nearly unclassifiable. The words feel alive and organic and, many times, don’t make sense in a traditional, well, sense. It’s challenging and pushes the experiential boundaries of both writing and reading. J. A. Tyler is another fantastic example of this. His 25-page book The Zoo is a sort of poetry/prose/chapbook mash-up and has incredibly high entropy. And it’s really fucking good!
Another way to think of entropy is as a jigsaw puzzle. When the puzzle is in 500 different pieces, its entropy is high. When it’s complete, its entropy is low.
And ultimately, like matter that is made up of “star stuff,” poets and prose-ists all use the same basic building blocks: letters and punctuation.
As the universe expands toward the infinite, entropy increases: matter is ripped into atoms, atoms into particles, particles into even smaller, yet-unnamed particles, and so on. It’s like trying to read Infinite Jest letter by letter. Only, as all of the matter rips apart and its elements and particles get farther and farther away from each other, gravity weakens to the point of near- (and then complete) non-existence. With nothing left to pull on anything and nothing left to change, time arguably stops meaning much of anything. Which is cool in the abstract but sucks because you are a trillion-trillion-trillion little pieces.
A lot of what is included above is there because I’m a nerd and find it terribly fascinating. But I do have a point, even if it’s a bit protracted by now (even at only 777 words, this column probably feels twice as long). That point deals with the readability of experimental literature.
I talked about this in a book review I wrote for Lindsay Hunter’s Daddy’s, which is definitely experimental.
“Experimental literature” is kind of a nebulous term and ultimately a misnomer. It’s not a misnomer in the typical misnomery way, where people simply [mis]use a term out of context, but rather in a way that is almost insulting to the writers whose work others deem “Experimental”—where calling the writing “Experimental” is ostensibly pejorative: “Man, this shit is weird! So it must just, like, be an experiment, right?” Weird and Experimental become synonyms. “Experimental” becomes a term that gets applied whenever a work is different, difficult, or breaks with established conventions.
And as I mention in my review, that way of thinking is simply lazy.
Hunter’s prose is unconventional and it certainly decenters the reader from his or her preconceived notions of how to read a book. I won’t say too much more about it here but I definitely recommend checking it out. However, her book has less entropy (and is thus somewhat easier to read) than Darby Larson’s The Iguana Complex.
Just comparing these two, you can kind of see that readability falls on a continuum. On one end of the spectrum, you have the alphabet chart that circles most early-elementary school classrooms, followed by See Spot Run and The Cat in the Hat; at the other end you have Gravity’s Rainbow, followed by Finnegans Wake.[5]
Experimental literature is an interesting area[6] because the best experimental prose is only as entropic as it needs to be, no more and no less. When people get overly crafty and make their sentences complex for the sake of being complex, it’s obvious to the reader and—at least to me—off-putting. Entropy is a good thing to keep in mind when writing: even something complex should only be as complex as it needs to be, or you risk putting off a portion of the audience.
After all, can anyone truly say just what the fuck actually happens in Finnegans Wake?
[1] As well as Greene’s The Fabric of the Cosmos and The Hidden Reality.
[2] Krikalev has spent 803 days, 9 hours and 39 minutes, or 2.2 years, in space.
[3] This has to do with relativity and time dilation.
[4] I submit I am absolutely generalizing here.
[5] I am not counting my infinitely spaced Infinite Jest example, which would fall further to the right, way out past Finnegans Wake.
[6] For lack of a better term.
Nice post and use of entropy to describe some great experimental literature. If you haven’t read them already, you might be interested in The Information by James Gleick and Decoding the Universe by Charles Seife. They both have excellent material on the quantification of information, the identical entropy equations for thermodynamics and information, and how physical laws are related to information. Gleick’s book is a bit more upbeat and talks a lot more about the history of information (categorization, dictionaries, Ada Lovelace et al) whereas Seife keeps pushing the point about the heat death of the universe and seems to delve more into the mathematics of everything, but both together are quite informative and somewhat mindblowing.
http://www.amazon.com/Information-History-Theory-Flood/dp/0375423729
http://www.amazon.com/Decoding-Universe-Information-Explaining-Everything/dp/067003441X
Holy cow! I’m salivating just thinking about those books!! I’ll definitely be checking them out.