October 26, 2025 · 19 mins

This episode covers an academic paper titled **"Active Use of Latent Constituency Representation in both Humans and Large Language Models,"** which explores how sentences are internally represented in the human brain and in large language models (**LLMs**) such as ChatGPT. The authors introduce a novel **one-shot learning word deletion task** in which participants infer a deletion rule from a single example. Both humans and LLMs tend to delete a **complete linguistic constituent** rather than a nonconstituent word string, suggesting that latent, hierarchical linguistic structures emerge in both. The study further demonstrates that this **deletion behavior** can be used to reconstruct a **constituency tree representation** that is structurally consistent with linguistically defined trees. Finally, the authors investigate how **language-dependent rules** are inferred and find that native speakers rely primarily on **syntactic structure** rather than semantic plausibility in this task.
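To make the task's mechanics concrete, here is a minimal sketch (not the authors' code or prompt) of one trial: a one-shot prompt is built from a single demonstration pair, and the model's response is scored by checking whether the deleted words form a complete constituent in a reference parse. The demonstration sentences, the simulated model response, and the hand-written bracketed parse are all hypothetical stand-ins.

```python
# Minimal sketch of a one-shot word-deletion trial and its constituency
# check. Assumptions: the example sentences, the "model_output" string,
# and the reference parse are invented for illustration.

from nltk import Tree

def one_shot_prompt(demo_src, demo_tgt, test_src):
    """Frame the task as in-context learning from a single demonstration."""
    return (f"Original: {demo_src}\nAfter deletion: {demo_tgt}\n"
            f"Original: {test_src}\nAfter deletion:")

def deleted_span(original, edited):
    """Return (start, end) word indices of the single contiguous span removed
    from `original` to yield `edited`, or None if it is not one deletion."""
    src, tgt = original.split(), edited.split()
    i = 0
    while i < len(tgt) and src[i] == tgt[i]:
        i += 1
    j = len(src) - (len(tgt) - i)
    return (i, j) if j > i and src[j:] == tgt[i:] else None

def constituent_spans(tree):
    """Collect the (start, end) leaf span of every subtree in one walk."""
    spans = set()
    def walk(node, start):
        if isinstance(node, str):      # a leaf consumes one word position
            return start + 1
        pos = start
        for child in node:
            pos = walk(child, pos)
        spans.add((start, pos))
        return pos
    walk(tree, 0)
    return spans

# Hypothetical trial: the demonstration deletes the PP "in the park".
prompt = one_shot_prompt("the dog slept in the park",
                         "the dog slept",
                         "a boy with a kite ran home")
model_output = "a boy ran home"        # stand-in for an LLM's response

# Hand-written reference parse; the paper compares against linguistically
# defined constituency trees.
parse = Tree.fromstring(
    "(S (NP (NP a boy) (PP with (NP a kite))) (VP ran home))")

span = deleted_span("a boy with a kite ran home", model_output)  # (2, 5)
print(span in constituent_spans(parse))  # True: "with a kite" is a PP
```

In the paper's framing, this per-trial span check is only the scoring step; deletion behavior aggregated over many sentences is what lets the authors reconstruct a full constituency tree for comparison with linguistically defined ones.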


Source:

https://arxiv.org/pdf/2405.18241
