Some of these connections were what drew my attention to the possibility that there are morphisms between different control constructs.
Chris Okasaki found a link between Lazy Evaluation / Call by Need and Continuations.
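As a reminder to myself of what the call-by-need half of that link means, here is a minimal sketch (my own illustration of memoized thunks, not Okasaki's CPS construction): a suspended computation that is forced at most once.

```python
# A minimal sketch of call-by-need: delay a computation in a thunk and
# memoize the result, so the body runs at most once no matter how many
# times the value is demanded. (Illustrative only; Okasaki's result is
# about expressing this discipline via continuations/CPS.)

class Thunk:
    """Delay a computation; force it at most once (call-by-need)."""
    def __init__(self, compute):
        self._compute = compute
        self._forced = False
        self._value = None

    def force(self):
        if not self._forced:
            self._value = self._compute()  # run the suspended computation
            self._forced = True
            self._compute = None           # drop the closure so it can be GC'd
        return self._value

calls = []

def expensive():
    calls.append(1)   # record each actual evaluation
    return 21 * 2

t = Thunk(expensive)
print(t.force())   # prints 42 (evaluates the body)
print(t.force())   # prints 42 (cached, body not re-run)
print(len(calls))  # prints 1 -- the body ran exactly once
```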
The Closures/Actors isomorphism was demonstrated by Guy Steele/Dan Friedman but was rejected by Hewitt.
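The flavour of that correspondence can be sketched in a few lines (my own toy framing, not Steele's or Friedman's construction): an actor's behavior is a closure over its private state that handles one message at a time.

```python
# A hedged sketch of the closure/actor correspondence: the "actor" is just
# a closure capturing its own local state; sending a message is calling it.

def make_counter_actor():
    state = {"count": 0}        # private state captured by the closure

    def receive(message):
        # dispatch on the message, as an actor's script would
        if message == "inc":
            state["count"] += 1
            return None
        if message == "get":
            return state["count"]
        raise ValueError(f"unknown message: {message}")

    return receive              # the closure *is* the actor's behavior

counter = make_counter_actor()
counter("inc")
counter("inc")
print(counter("get"))  # prints 2
```

Hewitt's objection, as I understand it, was about what closures leave out (fairness, true concurrency, locality of communication), not about this sequential picture being wrong on its own terms.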
Continuations in natural language are described by Barker here and in this book.
Learnt that combinational logic sits one level below regular languages in that it doesn't need memory: a combinational circuit is a finite table of input/output combinations, whereas recognizing a regular language requires rules that recall what came before. I think this difference is apparent only at the hardware level. I need to poke deeper to understand whether there's a mathematically tractable divide between the two: combinational logic and regular languages. The similarity I feel is that both carry a preset set of rules in memory.
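The memory/no-memory divide can be made concrete (my own examples, chosen for minimality): a full adder is a pure function of its present inputs, while even a one-bit regular language like "even number of 1s" needs state carried across the input.

```python
# Combinational logic: output is a fixed function of the current inputs only.
def full_adder(a, b, carry_in):
    total = a + b + carry_in
    return total % 2, total // 2   # (sum bit, carry out)

# Regular language {w : w contains an even number of 1s}: a two-state DFA,
# i.e. one bit of memory threaded through the whole input.
def even_ones(bits):
    state = 0                      # memory: parity of 1s seen so far
    for b in bits:
        state ^= b                 # transition function
    return state == 0              # accept iff in the even-parity state

print(full_adder(1, 1, 0))   # prints (0, 1)
print(even_ones([1, 0, 1]))  # prints True
print(even_ones([1, 1, 1]))  # prints False
```

The divide is mathematically tractable in at least one sense: a combinational function is defined on fixed-width inputs, while a DFA accepts strings of unbounded length, which is exactly what forces the state.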
There seems to be a link between sheaves, presheaves, and computation. I think I need to deepen my understanding of how topology and algebra are related in order to ground exactly how this relationship is structured. I need to start collecting resources towards this end here. Some details can be found here: https://en.wikipedia.org/wiki/Computable_topology
Learnt about Böhm trees. Need to understand their relationship to the topology of the Lambda Calculus.
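To fix the idea for myself: a Böhm tree is built from head normal forms, with a bottom node wherever head reduction diverges. A rough sketch (my own de Bruijn encoding and a bounded head reducer, not from any particular source) shows why Omega = (λx. x x)(λx. x x) contributes a bottom node:

```python
# Toy lambda-calculus terms with de Bruijn indices, plus a bounded head
# reducer. A term with a head normal form is a node of its Böhm tree;
# a term whose head reduction never terminates (like Omega) is bottom.
from dataclasses import dataclass

@dataclass
class Var:
    i: int          # de Bruijn index

@dataclass
class Lam:
    body: object

@dataclass
class App:
    f: object
    a: object

def shift(t, d, c=0):
    """Shift free variables >= c by d."""
    if isinstance(t, Var):
        return Var(t.i + d) if t.i >= c else t
    if isinstance(t, Lam):
        return Lam(shift(t.body, d, c + 1))
    return App(shift(t.f, d, c), shift(t.a, d, c))

def subst(t, s, j=0):
    """Substitute s for variable j in t."""
    if isinstance(t, Var):
        return s if t.i == j else t
    if isinstance(t, Lam):
        return Lam(subst(t.body, shift(s, 1), j + 1))
    return App(subst(t.f, s, j), subst(t.a, s, j))

def step(t):
    """One leftmost (head-position) beta step, or None if already in hnf."""
    if isinstance(t, App) and isinstance(t.f, Lam):
        return shift(subst(t.f.body, shift(t.a, 1)), -1)
    if isinstance(t, App):
        f2 = step(t.f)
        return App(f2, t.a) if f2 is not None else None
    if isinstance(t, Lam):
        b2 = step(t.body)
        return Lam(b2) if b2 is not None else None
    return None

def head_normal_form(t, limit=50):
    """Return an hnf, or None if the step budget runs out (bottom)."""
    for _ in range(limit):
        t2 = step(t)
        if t2 is None:
            return t
        t = t2
    return None

I = Lam(Var(0))                   # identity, λx. x
w = Lam(App(Var(0), Var(0)))      # λx. x x
Omega = App(w, w)                 # (λx. x x)(λx. x x)

print(head_normal_form(App(I, I)))  # prints Lam(body=Var(i=0))
print(head_normal_form(Omega))      # prints None: a bottom node
```

The topological angle, as far as I can tell so far, is that Böhm trees order terms by how much stable head-normal-form information they reveal, with bottom as the least element.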
Noson Yanofsky seems to be treading similar space: http://www.sci.brooklyn.cuny.edu/~noson/MCbeginning.pdf There are a lot of neat diagrams here that show how different ideas are related, in a Pascal-esque esprit de géométrie.