Natural language affords humans the ability to construct an unbounded array of hierarchically structured expressions. The syntactic component forms binary-branching sets via the operation MERGE, which takes objects from the lexicon or objects already present in the syntactic workspace. These structures in turn regulate linguistic meaning, which amounts to a set of conceptual instructions. A major strand within the generative enterprise further assumes that assembling linguistic structures involves no reference to extra-mental entities (i.e., there is no ‘word-world’ relation). This type of semantic internalism is defended here through a broad range of case studies. The unifying theme throughout is that linguistic structure and meaning are wholly mind-internal processes exhibiting a specific computational and representational architecture. By injecting models of syntax and semantics with concerns about cognitive constraints, I review how recent efforts in this direction may reconcile properties of human language with endogenous properties of the brain.