To understand this note one must first understand the concept of free energy as described in the note "information and the origins of entropy", which would have been more appropriately titled "information and the origins of complexity".
The free-energy relationship Ḟ = (∑Ḟᵢ)²/E is just that: it helps us understand that the more complex an information package becomes, the more free energy is reduced as a proportion of the total energy; that is, the potential energy available for doing work shrinks relative to the total.
However, we have tools that Einstein did not have: we have computers, and so we can experiment with simulations.
We are not sure it will work, but we try because we are human.
There is a computer; it has no program, it is blank. When it is switched on it consumes ˄0 amount of current. Then 1 byte of information is programmed into the computer, and the computer consumes ˄1 amount of current.
˄1 - ˄0 = ▲1
Clearly ▲1 is the amount of current required to sustain that 1 byte of information.
In the second step, the 1 byte of information is programmed to move from point A to point B on the screen. While this occurs, the computer consumes ˄2 amount of current.
˄2 - ˄1 = ∆1
∆1 is the free energy, and ∆1/▲1 is the proportion of free energy to total energy.
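The definitions above can be illustrated with a toy calculation. The current readings below are invented placeholders for the sake of the arithmetic, not measurements of any real machine.

```python
# Invented placeholder readings (hypothetical, NOT real measurements).
lam0 = 10.00   # current with the computer on but blank (˄0)
lam1 = 10.05   # current with 1 byte programmed in (˄1)
lam2 = 10.12   # current while that byte moves from A to B (˄2)

sustain = lam1 - lam0   # ▲1: current required to sustain the byte
free = lam2 - lam1      # ∆1: free energy, available for doing work
ratio = free / sustain  # ∆1/▲1: the note's proportion of free energy
print(sustain, free, ratio)
```

Whatever the real readings turn out to be, the experiment only needs the two differences, not the absolute currents.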
Then the experiment is repeated with 2 bytes of information, then 3 bytes, and so on, up to at least 100 bytes of information.
What we should find is that the proportion of free energy declines even though the total energy of the system increases.
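The repeated experiment can be sketched as a toy simulation. The functional forms below are pure assumptions chosen only to illustrate the predicted trend, not measured behaviour: the sustaining current ▲ is assumed to grow linearly with the number of bytes, while the free-energy increment ∆ is assumed to grow sublinearly.

```python
import math

def sustaining_current(n_bytes):
    # Hypothetical model: energy to hold the information (▲)
    # grows linearly with the number of bytes.
    return 0.05 * n_bytes

def free_current(n_bytes):
    # Hypothetical model: free energy (∆) grows sublinearly,
    # so its share of the total shrinks as complexity grows.
    return 0.07 * math.sqrt(n_bytes)

for n in (1, 10, 100):
    sustain = sustaining_current(n)
    free = free_current(n)
    total = sustain + free       # total energy keeps rising...
    proportion = free / sustain  # ...while ∆/▲ keeps falling
    print(n, round(total, 3), round(proportion, 3))
```

Under these assumed curves the printed proportion falls as the byte count rises while the total climbs, which is exactly the trend the experiment is meant to test for in real measurements.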
The computer is the best simulation tool that we have at the moment. If this works, then it might even be possible, with thought, to tackle E = mc².