# Leisure Readings

Idle thoughts, problems, notes, among others.

## Biochemistry

- Berg, Jeremy M., John L. Tymoczko, Gregory J. Gatto, and Lubert Stryer. *Biochemistry*. 8th ed. W. H. Freeman & Co., 2015. ISBN: 9781464126109.
- Shriver, Atkins. *Inorganic Chemistry*. 5th ed. 2010.

Following the MIT undergraduate general biochemistry course. The long-term goal is to understand endosymbiosis in detail. I'll possibly need some organic chemistry for the molecular biology of the gene.

I'd like to know whether there is a book that explicates the differences between eukaryotes and prokaryotes from the perspective of (endo-)symbiogenesis.

### Quick facts about pH

Tip: `\ce{H3O+}` renders as $\ce{H3O+}$, and so on.

I have nearly forgotten all the high school chemistry stuff.

The pH value is defined as the negative base-10 logarithm of the concentration of $\ce{H3O+}$ (equivalently regarded as $\ce{H+}$) in water,
$$
\mathrm{pH} := - \log_{10}([\ce{H+}])
$$
Of course this depends on the unit; the unit is that of molar concentration, $\operatorname{M} = \operatorname{mol}\cdot \operatorname{L}^{-1}$. In most cases the ion product $[\ce{H+}][\ce{OH-}]$ is constant, approximately $1.0\times 10^{-14}$ at room temperature: the equilibrium constant of the autoionization of water is fixed, and $[\ce{H2O}]$ barely changes. Since it is rare for $[\ce{H+}]$ to exceed $1\operatorname{M}$ or to fall below $10^{-14}\operatorname{M}$, the pH is **said to** range from 0 to 14, but this is not always the case. In particular, when $[\ce{H+}] = [\ce{OH-}]$, the pH is 7.
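As a sanity check on the definition, it is easy to compute directly. A minimal sketch (the function names `pH` and `pOH_from_pH` are mine, and $K_w$ is taken at its approximate room-temperature value):

```python
import math

KW = 1.0e-14  # ion product of water at ~25 °C (approximate)

def pH(h_conc: float) -> float:
    """pH from the molar H3O+ concentration (mol/L)."""
    return -math.log10(h_conc)

def pOH_from_pH(ph: float) -> float:
    """Since [H+][OH-] = Kw, we have pH + pOH = -log10(Kw) ≈ 14."""
    return -math.log10(KW) - ph

print(pH(1.0e-7))   # neutral water: ≈ 7
print(pH(1.0))      # 1 M strong acid: 0
print(pH(10.0))     # > 1 M: the pH goes negative, outside the "0 to 14" range
```

Note the last line: nothing in the definition forbids a negative pH, which is the point made above.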

## Cryptography

- ~~Katz, Lindell - Introduction to Modern Cryptography~~
- Bellare, Rogaway - Introduction to Modern Cryptography
- Silverman, Tate - Rational Points on Elliptic Curves
- Washington - Elliptic Curves, Number Theory and Cryptography

Elliptic curve cryptography for fun and, why not, at this magnificent dawn of the cyber-surveillance dystopia. The subculture is also worth attention.

Best served with some autism.

## Thermodynamics and Contact Geometry

Reading biochemistry really got me into thermodynamics, a subject that didn't interest me at all in my undergraduate years. I know some little facts (without knowing how) about the contact geometry of thermodynamics. The 1-form $$ \alpha = dU - T\,dS + P\,dV $$ vanishes on a Legendrian submanifold, and the Legendre transform is a simple example of a contact transformation. Regarding expressions such as $dU$ as differentials, the Maxwell relations just say that $d^2=0$.
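To spell out the $d^2 = 0$ remark in one case (my own reconstruction, not from any particular source): on the Legendrian submanifold where $\alpha$ vanishes, $dU = T\,dS - P\,dV$, so applying $d$ again gives
$$
0 = d^2U = dT\wedge dS - dP\wedge dV .
$$
Taking $(S,V)$ as coordinates, with $T = (\partial U/\partial S)_V$ and $P = -(\partial U/\partial V)_S$, and comparing the coefficients of $dS\wedge dV$,
$$
\left(\frac{\partial T}{\partial V}\right)_S = -\left(\frac{\partial P}{\partial S}\right)_V ,
$$
which is one of the Maxwell relations; the other three arise the same way from $dH$, $dF$, $dG$.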

I need to find some helpful sources. I've also been planning for some time to learn how optimization problems are turned into index theory, Morse theory, Floer homology, etc.; this seems to be a good point of departure. John Baez wrote some elementary material, but that's too simple. He also has a project called information geometry, but it is again superficial to my taste.

Some idle thoughts. What's the relation between optimization problems and index theory? All indices are just $K$-theory pushforwards of something (a groupoid, a groupoid action, etc.), and we know that the $K$-theory pushforward of a Lie group $G$-action on a manifold $M$ gives the quantization of $G$ (roughly, its representation; and if the action is given by a moment map, this gives the quantization of the observable - its momentum). This is done through the Atiyah-Segal *fixed point* decomposition. Morse theory is also some sort of index theory: it utilizes the information carried by *critical points* to probe the topology of a manifold. The problem with $K$-theory is that it is too abstract and doesn't give a good intuition on what's going on "on the physics side". I can't help but speculate that contact topology will give some insight into the nature of $K$-theory and its index, and even quantization.

An important question that needs to be contemplated over and over again:

What is the *real* difference between quantum and classical?

Some notes are moved to a separate post.

## Information theory

- Li, Vitányi - An Introduction to Kolmogorov Complexity and Its Applications. 1997 ed. (now switched to the 2019 ed.)
- Yury Polyanskiy, Yihong Wu - Lecture Notes on Information Theory
- Neri Merhav - Information Theory and Statistical Physics - Lecture Notes URL

For information theory proper, following the MIT graduate information theory course. Occasionally consulting *Computability and Complexity Theory*. By the way, Polyanskiy's lecture notes are fantastic.

For better insights into emergence and complex systems, the nature of logical inference, information, etc.

#### 2023-02-28 (semi-problem)

I'm not sure whether I understand it correctly, but pp. 99-100 of *Li and Vitányi, 1997* seem to claim that the part of the Kolmogorov complexity of an object $x$ corresponding to the encoding of a Turing machine $T$ in a universal Turing machine $T_U$ is the regular, "valuable" one, and that what is really intrinsic to the object $x$ is the random or "useless" information, since the ordering of Kolmogorov complexities is independent of the choice of the encoding of Turing machines by a universal Turing machine. Hence, in
$$
C(x) = \min\big\{l(T) + C(x|T): T\in\{T_0,T_1,\dots\}\big\} + O(1),
$$
the regular part corresponds to the description of the Turing machine, $l(T)$, which is dependent on how it is represented, and $C(x|T)$ is the "random" part.

Given a universal computable function $\psi_U:\mathbb{N}^2\to\mathbb{N}$, this amounts to saying that what is really important and "regular" is the function $s(x)$ in $$ \psi_U(x,n) = \phi_{s(x)}(n) $$ where $\phi_k$ is the Turing machine with Gödel number $k$, together with the Gödel numbering in use. Occasionally $s$ is chosen as the identity function. Hence if the numbering is chosen s.t. a particular Turing machine's Gödel number $k$ has a very small description length, then the complexity of the regular part can be regarded as $0$; the choice corresponds to a shift of the constant.

#### 2023-03-04 Dovetailing; Inductive Reasoning

Dovetailing is really badly explained on p. 192 of *Li, Vitányi, 1997*. Dovetailing here simply means: "in the $i$-th stage there are $i$ machines running (computing $\phi(pq_1),\phi(pq_2),\dots,\phi(pq_i)$ respectively), and each step in the stage corresponds to one step of computation of each of the machines." In the text the binary sequences $\\{0,1\\}^\ast$ are given an arbitrary order (meaning $q_0,q_1,q_2,\dots$ can be any sequences, not necessarily $0,1,10,\dots$), so the algorithm given might seem suspiciously inefficient.
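The stage-by-stage interleaving can be sketched with Python generators standing in for Turing machines. This is my own toy rendition of dovetailing, not the book's algorithm; `dovetail` and `slow_counter` are illustrative names:

```python
def dovetail(machines, max_stages=10_000):
    """Run generators in dovetailed stages: in stage i, each of the
    first i machines takes one step.  A machine "halts" by returning
    a value (delivered via StopIteration.value).  Yields (index, value)
    for each machine as it halts; non-halting machines simply keep
    consuming steps, as in real dovetailing."""
    alive = {}
    for stage in range(1, max_stages + 1):
        if stage <= len(machines):                 # admit the i-th machine at stage i
            alive[stage - 1] = machines[stage - 1]
        for idx in list(alive):                    # one step per running machine
            try:
                next(alive[idx])
            except StopIteration as halt:
                yield (idx, halt.value)
                del alive[idx]
        if not alive and stage >= len(machines):
            break

def slow_counter(n):
    """Toy 'machine': runs for n steps, then halts with output n*n."""
    for _ in range(n):
        yield
    return n * n

# Machine 1 (n=1) needs the fewest steps, so it halts first even though
# machine 0 was admitted earlier.
results = list(dovetail([slow_counter(n) for n in (5, 1, 3)]))
print(results)
```

The point of the arbitrary ordering remark above is visible here: correctness doesn't depend on the order in which machines are admitted, only on every machine eventually getting infinitely many steps.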

In the fifth chapter *ibid.*, the Occam's Razor principle seems similar to the principle of sufficient reason. The latter can be used to stipulate postulates such as Hamilton's principle (Leibniz used it to justify, or rather explain, Fermat's principle; see van Fraassen, *Laws and Symmetry*, ~~possibly the first two chapters~~ 1999, p. 12) and the principle of maximum entropy. There are certainly some deep connections underlying complexity, informational and thermodynamic entropy, the path integral formulation of QM, and such.