Takateru Yamakoshi
Verified email at g.ecc.u-tokyo.ac.jp - Homepage
Title · Cited by · Year
Reconstructing the cascade of language processing in the brain using the internal computations of a transformer-based language model
S Kumar, TR Sumers, T Yamakoshi, A Goldstein, U Hasson, KA Norman, ...
bioRxiv, 2022.06.08.495348, 2022
36 · 2022
Investigating representations of verb bias in neural language models
RD Hawkins, T Yamakoshi, TL Griffiths, AE Goldberg
arXiv preprint arXiv:2010.02375, 2020
22 · 2020
Probing BERT's priors with serial reproduction chains
T Yamakoshi, TL Griffiths, RD Hawkins
arXiv preprint arXiv:2202.12226, 2022
12 · 2022
Causal interventions expose implicit situation models for commonsense language understanding
T Yamakoshi, JL McClelland, AE Goldberg, RD Hawkins
arXiv preprint arXiv:2306.03882, 2023
4 · 2023
Shared functional specialization in transformer-based language models and the human brain
S Kumar, TR Sumers, T Yamakoshi, A Goldstein, U Hasson, KA Norman, ...
2*
Neural Constructions
T Yamakoshi, R Hawkins
OSF, 2020
Reconstructing the cascade of language processing in the brain using the internal computations of transformer language models
S Kumar, TR Sumers, T Yamakoshi, A Goldstein, U Hasson, KA Norman, ...
Articles 1–7