Institute for Categorical Cybernetics
Governance and control for the age of AI
Our mission is to develop theory and software for governing systems that learn and make decisions, for the benefit of their users and of humanity.
Latest posts

Iteration with Optics
In this post I'll describe the theory of how to add iteration to categories of optics. Iteration is required for almost all applications of categorical cybernetics beyond game theory, and is something we've been handling only semiformally for some time. The only tool we need is already one we have inside the categorical cybernetics framework: parametrisation weighted by a lax monoidal functor. I'll end with a conjecture that this is an instance of a general procedure to force states in a symmetric monoidal category. 
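The flavour of iteration at stake can be sketched in ordinary code. This is purely illustrative and uses none of the optics machinery from the post: a step function of type state → (output, state), iterated by threading the state back in.

```python
# Illustrative sketch only, not the categorical construction:
# iteration means repeatedly feeding a system's updated state back
# into its own step function.

def iterate_system(step, state, n):
    """Run a stateful step function n times, collecting the outputs."""
    outputs = []
    for _ in range(n):
        out, state = step(state)
        outputs.append(out)
    return outputs, state

# Example: a counter that outputs its current value, then increments.
outs, final = iterate_system(lambda s: (s, s + 1), 0, 3)
# outs == [0, 1, 2], final == 3
```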
Passive Inference is Compositional, Active Inference is Emergent
This post is a writeup of a talk I gave at the Causal Cognition in Humans and Machines workshop in Oxford, about some work in progress I have with Toby Smithe. To a large extent this is my take on the theoretical work in Toby's PhD thesis, with the emphasis shifted from category theory and neuroscience to numerical computation and AI. In the last section I will outline my proposal for how to build AGI. 
How to Stay Locally Safe in a Global World
Suppose your name is x and you have a very important state machine that you cherish with all your heart. Because you love this state machine so much, you don't want it to malfunction, so you designate a subset of its states which you consider safe. If your state machine ever leaves this safe set you are in big trouble, so you ask the following question. 
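For a finite state machine, the question of ever leaving the safe set can be answered by exhaustive search over reachable states. A minimal sketch (the function name and encoding are illustrative, not from the post):

```python
# Hypothetical sketch: does every state reachable from `start` via
# `transition` (state -> iterable of successor states) lie in `safe`?

def stays_safe(start, transition, safe):
    seen, frontier = set(), [start]
    while frontier:
        s = frontier.pop()
        if s not in safe:
            return False  # found a reachable unsafe state
        if s in seen:
            continue
        seen.add(s)
        frontier.extend(transition(s))
    return True  # all reachable states are safe

# Example: states cycle 0 -> 1 -> 2 -> 3 -> 0.
step = lambda s: [(s + 1) % 4]
assert stays_safe(0, step, {0, 1, 2, 3})
assert not stays_safe(0, step, {0, 1, 2})
```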
AI Safety Meets Value Chain Integrity
Advanced AI systems making economic decisions in supply chains and markets create poorly understood risks, especially by undermining the fundamental concept of individuality of agents. We propose to research these risks by building and simulating models. 
About the CyberCat Institute blog
This post explains how to write for our blog, including a short summary of the format we use.