Note: this post was slated to appear on May 31, 2014, but events outside of my control (such as grant submission deadlines, and parties at my house) delayed its issuance.

The word "entropy" is used a lot, isn't it? OK, not in your average conversation, but it is a staple of conversations between some scientists, but certainly all nerds and geeks. You have read my introduction to information theory I suppose (and if not, go ahead and start here, right away!) But in my explanations of Shannon's entropy concept, I only obliquely referred to another "entropy": that which came before Shannon, the thermodynamic entropy concept of Boltzmann and Gibbs. (The concept was originally discussed by Clausius, but because he did not give a formula, I will just have to ignore him here.)

Why do these seemingly disparate concepts have the same name? How are they related? And what does this tell us about the second law of thermodynamics? This is the blog post (possibly a series) where I try to throw some light on that relationship. I suspect that what follows below isn't very original (otherwise I probably should have written it up in a paper), but I have to admit that I didn't really check. I did write about some of these issues in an article that was published in a Festschrift on the occasion of the 85th birthday of Gerry Brown, who was my Ph.D. co-advisor and a strong influence on my scientific career. He passed away a year ago to this day, and I have not yet found a way to remember him properly. Perhaps a blog post on the relationship between thermodynamics and information theory is appropriate, as it bridges a subject Gerry taught often (Thermodynamics) with a subject I have come to love: the concept of information. But face it: a book chapter doesn't get a lot of readership. Fortunately, you can read it on arxiv here, and I urge you to because it does talk about Gerry in the introduction.

Before we get to the relationship between Shannon's entropy and Boltzmann's, how did they end up being called by the same name? After all, one is a concept within the realm of physics, the other from electrical engineering. The one to blame for this confluence is none other than John von Neumann: the mathematician, physicist, engineer, computer scientist (perhaps Artificial Life researcher, sometimes moonlighting as an economist). It is difficult to appreciate the genius that was John von Neumann, not least because there aren't many people who are as broadly trained as he was. For me, the quote that fills me with awe comes from another genius whom I've had the privilege to know well, the physicist Hans Bethe. I should write a blog post about my recollections of our interactions, but there is already a write-up in the book memorializing Hans's life.
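But back to the two entropies. To see why the naming question even arises, it helps to put the two formulas side by side before diving into the history: Shannon's entropy of a probability distribution $p_1,\dots,p_n$, and the Gibbs form of the thermodynamic entropy (these are the standard textbook definitions, shown here just for orientation; we'll derive and dissect them properly as the series goes on):

$$H = -\sum_{i} p_i \log_2 p_i\;, \qquad\qquad S = -k_{\rm B} \sum_{i} p_i \ln p_i\;.$$

Up to Boltzmann's constant $k_{\rm B}$ and the choice of the base of the logarithm, they are the very same expression, and that formal coincidence is exactly what the rest of this series will try to unpack.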