QRI is new, but many of our core tools can be traced back to existing research lineages. Some highlights:

**David Marr** is most famous for Marr’s Three Levels (along with Tomaso Poggio), which describes “the three levels at which any machine carrying out an information-processing task must be understood”:

- *Computational theory*: What is the goal of the computation, why is it appropriate, and what is the logic of the strategy by which it can be carried out?
- *Representation and algorithm*: How can this computational theory be implemented? In particular, what is the representation for the input and output, and what is the algorithm for the transformation?
- *Hardware implementation*: How can the representation and algorithm be realized physically? [Marr (1982), p. 25]

This framework *sounds* simple, but it is remarkably important: arguably most of the confusion in neuroscience (and in phenomenology research) comes from starting a sentence on one Marr-Poggio level and finishing it on another, and the framework gives people a way to debug that confusion.
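A toy example (not from Marr, just an illustration) may help make the levels concrete: the computational-level goal "return the list sorted ascending" stays fixed, while the representation-and-algorithm level admits many distinct answers, each of which can in turn be realized on many different hardware substrates.

```python
# Illustrative sketch: one computational-level goal, two
# algorithm-level realizations of it.

def insertion_sort(xs):
    """One algorithm-level answer: grow a sorted output one item at a time."""
    out = []
    for x in xs:
        i = len(out)
        while i > 0 and out[i - 1] > x:
            i -= 1
        out.insert(i, x)
    return out

def merge_sort(xs):
    """A different algorithm-level answer to the *same* computational goal."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

# Same computational-level specification, different algorithms:
data = [3, 1, 4, 1, 5]
assert insertion_sort(data) == merge_sort(data) == sorted(data)
```

Statements true at one level (e.g. "this runs in O(n log n)") can be meaningless at another (the computational-level goal has no running time), which is exactly the kind of level-crossing confusion the framework catches.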

**Selen Atasoy** has pioneered a new method for interpreting neuroimaging which (unlike conventional approaches) may plausibly measure things directly relevant to phenomenology. Essentially, it combines fMRI, DTI, and MRI to calculate a brain’s intrinsic ‘eigenvalues’: the neural frequencies which naturally resonate in a given brain, along with how that brain is currently distributing energy (periodic neural activity) between them. It also seems *a priori* plausible that measuring these natural resonances could be a powerful technique for understanding the brain, since (1) they follow nicely predictable mathematical laws, and (2) a system with such harmonics will likely self-organize around them, and thus have a hidden predictability or elegance.
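The mathematical core of this approach can be sketched in miniature. In Atasoy's published work the harmonics are eigenmodes of the Laplacian on a DTI-derived connectome; the toy below (a hypothetical 8-node ring graph standing in for a real connectome) shows only the linear-algebra skeleton, not the actual pipeline:

```python
# Minimal sketch of harmonic decomposition on a graph, assuming a
# toy ring "connectome" in place of real tractography data.
import numpy as np

def laplacian_harmonics(adjacency):
    """Eigen-decompose the graph Laplacian L = D - A.

    The eigenvectors play the role of the network's natural resonant
    modes; the eigenvalues order them by spatial frequency.
    """
    degree = np.diag(adjacency.sum(axis=1))
    laplacian = degree - adjacency
    # eigh: the Laplacian is symmetric, so the spectrum is real and
    # the eigenvectors form an orthonormal basis.
    eigvals, eigvecs = np.linalg.eigh(laplacian)
    return eigvals, eigvecs

# Hypothetical stand-in connectome: 8 nodes connected in a ring.
n = 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

vals, vecs = laplacian_harmonics(A)

# A snapshot of neural activity can then be projected onto this basis,
# giving the distribution of energy across the harmonics:
activity = np.random.default_rng(0).standard_normal(n)
energy_per_mode = (vecs.T @ activity) ** 2
```

Because the eigenvectors are orthonormal, the per-mode energies sum to the total energy of the activity snapshot; the lowest eigenvalue is always zero (the constant mode), and higher eigenvalues correspond to finer spatial oscillations.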

**Giulio Tononi** offered the first “full-stack” paradigm for formalizing consciousness with Integrated Information Theory (IIT). It’s notable as a viable empirical theory of consciousness in its own right, a clear enunciation of what the ‘proper goal’ of a theory of consciousness should be (determining a mathematical object isomorphic to the phenomenology of a system), and also as a collection of clever tools for approaching the many sub-problems of consciousness. Tononi’s lineage traces back to Nobel laureate Gerald Edelman, a distinction shared by many stars of modern neuroscience such as Karl Friston, Olaf Sporns, and Anil K. Seth.

**David Pearce** has tirelessly advocated for *valence realism*, or the idea that some experiences *really do feel better* than others, and that this can bridge a key part of the is-ought distinction. Further reading: Can Biotechnology Abolish Suffering?

**Emmy Noether** provided the theoretical basis for formalizing invariants in physical systems through Noether’s theorem: ‘every symmetry in a system’s equations corresponds to a conserved quantity in that system (and vice-versa).’ This formed the seed for modern gauge theory, the mathematical basis for modeling conservation laws for energy, mass, momentum, and electric charge. We believe Noether’s work provides both concrete tools and aesthetic guidance for how we might formalize certain aspects of phenomenology. Read more.
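In one standard textbook formulation of the theorem (stated here for classical mechanics, as an illustration rather than a quotation of Noether's general result): if a Lagrangian $L(q, \dot{q}, t)$ is invariant under the continuous transformation $q \to q + \epsilon K(q)$, then the quantity

$$
Q = \frac{\partial L}{\partial \dot{q}}\, K(q)
$$

is conserved, $\frac{dQ}{dt} = 0$, along every solution of the equations of motion. For example, if $L$ is unchanged by spatial translation ($K = 1$), the conserved $Q$ is the momentum $p = \partial L / \partial \dot{q}$; invariance under time translation similarly yields conservation of energy.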