19 Comments
Alice Wanderland:

Love this post! You know what’s even more to your point? Now that we have these giant deep-learning-trained LLMs, which are still implemented on computers with well-understood transistors, we don’t understand what algorithms are being implemented within them, and it’s *still* hard to reverse engineer those algorithms out.

The mechanistic interpretability people are working on one- or two-layer transformers, and the latest large-scale attempt was at figuring out the exact algorithm that LLMs roughly the size of GPT-2 use to do arithmetic (https://arxiv.org/html/2502.00873). But that’s a far cry from understanding LLMs on the order of GPT-4/Claude 3.

Neuroscience is hard *even when* you have read AND write access to every literal variable and input/output you could want, and no ethical limitations on cutting up the model! (Imagine being able to make a biological neuron fire at exactly 5X its regular rate, or cut exactly and only half of its axon outputs.)
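To make that contrast concrete, here's a minimal sketch (a toy numpy network with made-up weights, purely illustrative, not from any real model) of the kind of intervention that's trivial with an artificial neuron but impossible with a biological one:

```python
import numpy as np

# Toy 2-layer network: the kind of "write access" you get with an
# artificial model but never with a biological brain.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # input -> hidden weights
W2 = rng.normal(size=(8, 3))   # hidden -> output weights

def forward(x, scale_unit=None, factor=1.0):
    """Run the network, optionally scaling one hidden unit's activation."""
    h = np.maximum(0, x @ W1)      # ReLU hidden layer
    if scale_unit is not None:
        h[scale_unit] *= factor    # the "make it fire at exactly 5x" intervention
    return h @ W2

x = rng.normal(size=4)
baseline = forward(x)
perturbed = forward(x, scale_unit=2, factor=5.0)  # unit 2 at 5x its rate
print(np.abs(baseline - perturbed))  # how much the output shifts
```

In a real LLM the analogous move is a forward hook on an activation, but the point is the same: exact, repeatable write access to any "neuron", and the interpretability problem is still hard.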

Tommy Blanchard:

Thank you! I agree, LLMs are a really interesting example of just how hard it is to understand big structures with this kind of organization. There's a lot of research on LLMs that starts to look like cognitive neuroscience, which is super cool imo

Mike Smith:

A sober caution. Science is hard and this is going to take a while.

I'm curious if there are any promising new technologies on the horizon that might help with the precision, either temporal or spatial.

Tommy Blanchard:

Bigger, more powerful magnets give better MRI spatial resolution, and there have been some significant advances recently, especially in imaging neuroanatomy (rather than neural activity), e.g. https://www.universityofcalifornia.edu/news/breakthrough-brain-imaging

While electrodes might sound crude, there are continual advances with them, allowing arrays with more electrodes that capture more neurons with fewer rejection issues: https://www.massdevice.com/precision-neuroscience-brain-electrode-record-bci/

In animals, genetic manipulation has opened up a lot of doors, like optogenetics (allowing you to manipulate neural activity with light: https://en.wikipedia.org/wiki/Optogenetics). I haven't kept up with it, but I suspect with advancing genetic tools, more doors will be opened--like perhaps optical imaging in more model organisms, or other totally new tools.

Mike Smith:

This is fascinating. Thank you!

Naveen Rao:

Another awesome article. Thanks!

I'm curious for your perspective on recent advances in commercial neuroimaging tech, e.g.:

- connectomics systems for surgical assistance like o8t.com

- Openwater's ultrasound-based system that is now in pre-sale

- Kernel's TD-fNIRS wearable

The irony to me is that the comparison for anything new is the "clinical standard" of legacy rater survey tools that are 40-50 years old and incredibly subjective and variable.

Tommy Blanchard:

I'm not familiar with the particular manufacturers, so can't really give an opinion on that. But in terms of techniques:

In general, it seems like connectome work has come a long way in the last decade or so, both in humans and non-humans. Which is great! But even having a full connectome isn't going to tell us everything about a brain--it's an important piece of the puzzle, but far from the whole thing.

fNIRS is great as a lightweight alternative to fMRI or EEG, and has its place in research, but it shares drawbacks with both: it measures a proxy for brain activity, suffers from movement artifacts, and has poor spatial resolution. One nice use I've seen is in infants/young children, where fMRI and EEG are less practical because the subject is... less cooperative.

All of these are great and show progress on our measures of neural activity, but they seem like incremental advances rather than the breakthrough that's going to let us crack the thing wide open, IMO.

Susy Churchill:

Great explanations - but please don't call autism a brain disorder. Neurodivergent people have hugely varied skills and abilities.

Tommy Blanchard:

What language would you prefer? Calling it a disorder isn't inconsistent with the statement that neurodivergent people have hugely varied skills and abilities. The NIH, CDC, and autism advocacy groups like Autism Speaks, all call it a disorder. The term "Autism Spectrum Disorder" is extremely common.

Frank:

Your comparison with diagnosing the operation of a digital circuit is interesting, but I wonder whether a better analogy would be electron-beam testing of semiconductor circuits? See for example

https://onlinelibrary.wiley.com/doi/epdf/10.1002/sca.4950050103

I have no recent knowledge of this topic, but in the case of circuit testing the investigation would have access to sophisticated modeling of the analogue switching behaviour and to large-scale behavioural models of the digital operation.

Cool Librarian:

Great article, but discouraging for someone who is interested in pursuing psychology and wants to figure out how her brain works. What is your advice for not getting discouraged enough to give up this inquiry?

Tommy Blanchard:

The brain is the most important thing to understand that I can imagine. Understanding it is about understanding ourselves, the kinds of creatures we are, and the mechanisms that make us up.

Our picture of the brain is fuzzy, and it takes a lot of work to clarify it. But that doesn't mean we don't already have a rich (fuzzy) picture now that is worth learning about because of how much it tells us about ourselves. We might not understand things at the same level as a computer, but we have an understanding of a lot of the different systems at work.

The difficulty of studying the brain also means there are lots of areas to make original contributions to, if original research is of interest.

Hopefully that's helpful?

Cool Librarian:

Thanks, but I asked for advice on keeping yourself emotionally and mentally balanced when working in a field that is theoretically and practically uncertain.

Susannah Mary Leopold:

This might help: I come from the humanities and worked as a translator for a decade, where technology and new developments are, in general, seen as a threat. For me, it was a revelation to see how scientists view uncertainty as potential - not knowing the answers and knowing that there is still a long way to go is exciting. Instead of seeing a threat, they see questions to answer and knowledge to discover. That's something that I've learnt to love too.

Cool Librarian:

But how do we know we are not leading ourselves astray in the process? Do we always have to keep vigilance over our cognitive biases, which forces us to practice intellectual humility? And doesn't science pose more questions than answers anyway?

Susannah Mary Leopold:

There are answers and then there are new questions and new answers to find :) I think being wary of unreliable sources is more important than cognitive biases, although it's good to keep both in mind.

Cool Librarian:

Why do you think being wary of unreliable sources is more important than being aware of cognitive biases?

Cool Librarian:

Thank you for answering my questions ☺️
