Unsupervised context sensitive language acquisition from large, untagged corpora

Zach Solan*, Eytan Ruppin, David Horn, Shimon Edelman

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review


Abstract

An alternative hypothesis is discussed, according to which syntax is an abstraction that emerges from exposure to language and coexists with the corpus data within the same representational mechanism. The Automatic Distillation of Structure (ADIOS) model is used; it has two components: a representational data structure, which is a directed multigraph, and a pattern acquisition (PA) algorithm that refines the graph in an unsupervised fashion. The ADIOS model incrementally learns the syntax of English from raw input by distilling structural regularities from accrued statistical co-occurrence and contextual cues. The patterns learned by ADIOS are also more powerful than context-free rewriting rules, because of their conservative nature.
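To make the two components described above concrete, the following Python sketch loads sentences as paths over a directed multigraph of lexical items and then runs a crude pattern-acquisition pass that collapses recurring sub-paths into new pattern vertices. It is only a rough illustration under stated assumptions: the frequency threshold, the BEGIN/END markers, and the helper names are invented for this example and do not reproduce the paper's actual significance-based acquisition criterion.

```python
# Illustrative sketch (not the authors' implementation) of an
# ADIOS-style pipeline: sentences become paths in a directed
# multigraph, and frequently recurring sub-paths are rewired into
# pattern vertices. Thresholds and names here are assumptions.
from collections import Counter, defaultdict


def build_graph(sentences):
    """Represent each sentence as a path; edge counts keep multiplicity."""
    edges = defaultdict(int)  # (u, v) -> number of parallel edges
    paths = []
    for sentence in sentences:
        tokens = ["BEGIN"] + sentence.split() + ["END"]
        paths.append(tokens)
        for u, v in zip(tokens, tokens[1:]):
            edges[(u, v)] += 1
    return edges, paths


def acquire_patterns(paths, min_len=2, max_len=4, min_count=2):
    """Collect sub-paths that recur often enough to be treated as patterns."""
    counts = Counter()
    for path in paths:
        for n in range(min_len, max_len + 1):
            for i in range(len(path) - n + 1):
                counts[tuple(path[i:i + n])] += 1
    return [seq for seq, c in counts.items() if c >= min_count]


def rewire(paths, patterns):
    """Replace each occurrence of a pattern with a single pattern vertex."""
    patterns = sorted(patterns, key=len, reverse=True)  # longest first
    new_paths = []
    for path in paths:
        i, out = 0, []
        while i < len(path):
            for seq in patterns:
                if tuple(path[i:i + len(seq)]) == seq:
                    out.append("P(" + " ".join(seq) + ")")
                    i += len(seq)
                    break
            else:
                out.append(path[i])
                i += 1
        new_paths.append(out)
    return new_paths


if __name__ == "__main__":
    corpus = ["the cat sat on the mat", "the dog sat on the rug"]
    edges, paths = build_graph(corpus)
    patterns = acquire_patterns(paths)
    print(rewire(paths, patterns)[0])
```

Running the example collapses the shared sub-path "sat on the" into a single pattern vertex, which is the flavor of unsupervised distillation the abstract describes, although the real model uses a statistical significance test rather than a raw frequency cutoff.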

Original language: English
Pages: 61-64
Number of pages: 4
State: Published - 2004
Event: 2004 AAAI Spring Symposium - Stanford, CA, United States
Duration: 22 Mar 2004 - 24 Mar 2004

Conference

Conference: 2004 AAAI Spring Symposium
Country/Territory: United States
City: Stanford, CA
Period: 22/03/04 - 24/03/04
