
Causal Slippery Fish

The Beginning

This was supposed to be my birthday blog post on entropy.
Tried to perfect this blog draft for eighteen extra days.
Much of this delay was caused by generating alternate timelines.

In a parallel multiverse timeline, fate invented artificial general intelligence sooner, and WordPress is now the name of a talented inventor bot.

Inspired by a famous AI blogger whose last name rhymed with a semicolon; 

WordPress already enabled AI plugins for creating interstellar non-linear narratives, only somewhat shy of tenets.
Magician's Choice

Twenty percent, or one-fifth of this blog post, is about my rant on why it took too long to write exactly 1111 words here. → Read 20% First

* You have chosen to skip past the whole rant.
* My birthday blog post is only nine days late.
* The rant above is at least a hundred words shorter.

. You exist as bits of vivid meaningless memory
.. Your simulated compute stream is browsing ChatGPH
... Your H ++ pointer halted on an interesting computation

Eighty percent, or four-fifths of this blog post, is about borderline interesting things somewhat related to our topic. → Read 80% First

Twenty percent just got over.

Time is a slippery fish in the realm of quantum mechanics, where it is treated as a continuous variable. It can take any real value within a given range, a continuous timeline.

However, there is a limit to how precisely our timelines can be measured: the Heisenberg uncertainty principle. The more precisely the time of an event is known, the less precisely its energy can be known, and vice versa.
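
In symbols, the energy-time form of the principle reads (my shorthand, written in the same style as the entropy formula further down):

\Delta E\,\Delta t \geq \frac{\hbar}{2}
\Delta E = uncertainty in energy
\Delta t = uncertainty in the timing of the event
\hbar = reduced Planck constant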

Some scientists believe this could mean that time is not actually a continuum, but rather, it is composed of discrete “quantum” units: tiny atomic clocks.

Like Grains of Sand

The more hidden states a system can take, the higher its entropy and disorder. But that is not all. Entropy is the capacity to remember. Entropy enables memory of time itself!

The first nine days of delay in writing this were caused by my failed attempts to recreate a Bandersnatch-like effect on this very WordPress blog post, so you, my dear reader, could get some extra free choice to click past the atrocities of this entropy rant.

For millennials without Netflix accounts who can still read long-form text, Bandersnatch is basically like Goosebumps – Choose Your Own Scare, but in an internet television format labelled appropriately as OTT. FYI only.

On a Cosmic Beach

Oh dear, the second nine days of delay(s) in writing this blog post were simply outrageous! You see dear reader, we had infinite blog titles to pick from, all generated by the smart and savvy ChatGPT. I was lost in a heap of titles, wandering aimlessly like an anxious mouse in a maze. A delicate balancing act, trying to tame beastly entropy, while still letting it roam free, like the wild stallion in town.

Then, to make matters worse, they forgot to train ChatGPT to rant.

ChatGPT is no good at rants unless they begin with “as an AI language model”.

Now here we are, chatting away about mysteries of the universe, the wonders of entropy, whiling away time.

Thanks for trying, my row bot friend.

Lub-dup. Dub-lub. Lub-dub.

Anxiety is Ticking

The next day we are staring at the screen once again wondering why these words won’t assemble themselves in this very exact sequence. Let’s try and comprehend this arrhythmic, timeless dance battle between procrastination and creativity.

At the smallest levels, every whole is made up of tiny invisible parts that can be in different positions and configurations. We call these parts “microstates”. We cannot ever measure or observe these!

But when we look at anything as a whole, we can measure its observable properties like temperature, pressure, and energy. We call these measurable properties “macrostates”.
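
A toy sketch of that split (my addition, in Python, with a made-up four-coin system): each full sequence of heads and tails is a microstate we never observe, while the total number of heads plays the role of the macrostate we do measure.

from itertools import product
from collections import Counter

# Hypothetical toy system: four coin flips.
# A microstate is one full sequence like ('H', 'T', 'H', 'H').
# The macrostate is the only thing we "measure": the total count of heads.
microstates = list(product("HT", repeat=4))
macrostates = Counter(state.count("H") for state in microstates)

for heads, count in sorted(macrostates.items()):
    print(f"macrostate: {heads} heads  ->  {count} hidden microstates")

Running it shows the middling macrostate (two heads) hides the most microstates, which is exactly the counting game entropy plays below.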

Bits in Pieces

The Boltzmann definition of entropy is a blessing in disguise.

Remarkably, the numbers we measure from the systemic whole (macrostates) are always related to the numbers we cannot ever measure in the tiny hidden parts (microstates).

The number of microstates that a macrostate can sustain represents the amount of disorder, random information, and memory of time a system can survive. And we call this beautiful property “entropy”.

Procrastination versus Creativity

The entropy of a closed system can increase over time, but it can never ever decrease. This is known as the second law of thermodynamics: the primary reason for the direction of time’s arrow.

S = k_{B}\ln\Omega
S = entropy of the macrostate
k_{B} = Boltzmann constant
\ln = natural logarithm
\Omega = number of possible microstates

The entropy of a system increases with the count of hidden microstates, bringing growth in disorder, complexity, and bits. More on that later.
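
A minimal numerical sketch (my addition, reusing the toy coin counts above and the standard value of the Boltzmann constant):

import math

K_B = 1.380649e-23  # Boltzmann constant, joules per kelvin

def boltzmann_entropy(omega: int) -> float:
    # S = k_B * ln(Omega) for a macrostate compatible with Omega microstates
    return K_B * math.log(omega)

# Example: the "2 heads" macrostate of the four-coin toy hides 6 microstates.
print(boltzmann_entropy(6))  # ~2.47e-23 J/K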

Today we will meet and greet entropy’s best friend, the inverted cousin of the power structure, the natural lawg!

Borderline traits log

When I first learnt about log functions I was unimpressed. “Exponential Inverter” would have made for a much more saleable math product.

Other functions carry more interesting stories. Trigonometric functions are better to joke with. Exponential functions get overhyped for hypergrowth. Even complex ones are usually imaginative and real.

But logs never make it to anybody’s top three favorite functions list. There is a function to define averages and its mean. Even that function looks more interesting than a log. You could say the logarithm is below average, but now we’re just being mean to the arithmetic mean.

Tiny Tree Talks

Log functions plotted on paper skins of dead trees magically chart the shape of growth of those very trees while they were still alive.

No one takes particular inspiration from the life cycle of dead trees. No one cares about slow growth unless it turns exponential later on.


logs and exponents on a date

The appearance of natural logs in almost all equations of entropy is not a coincidence. Log functions represent the gradual decline of exponential power growth. They describe the shadows of slow decay. Dissipation of energy over time.
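
A quick sketch of that inverse relationship (my addition): the natural log undoes the exponential, and where the exponential sprints, the log crawls.

import math

x = 5.0
print(math.log(math.exp(x)))  # 5.0 -- ln is the inverse of exp

# Exponential growth explodes; logarithmic growth barely budges.
for t in (1, 10, 100, 1000, 1_000_000):
    print(f"ln({t}) = {math.log(t):.3f}")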

all is well that ends well

One could also say my writing speed was logarithmic over the last month but my learnings about power laws were exponential. In the next few parts of this blog series, we will talk more about why dissipation of energy over time is always proportional to entropy. Useful heat.

As I bid farewell for today, I want to express my gratitude to those who have accompanied me on this journey, of how I turned almost ten times the natural logarithm of my age rounded off to the nearest tens.
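
(A back-of-the-envelope check, assuming a mid-thirties age on my part: the self-referential equation x = 10\ln x has a fixed point near x \approx 35.8, so an age in that neighbourhood really is almost ten times its own natural logarithm.)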

Thank you for reading Entropy Chronicles! A 9-part essay on Entropy and Everything delivered to you in 11 parts.

All is well that ends well.
See you later, alligator.
After a while, crocodile.
So long, and thanks.
For all the Phish.
Remember Always.

TS
09.02.2023
Fin.
