Piero Scaruffi (Copyright © 2013 Piero Scaruffi)
These are excerpts and elaborations from my book "The Nature of Consciousness"
A language is a set of sentences, each sentence a finite string of words drawn from a lexicon. A grammar is the set of rules that determines whether a given string of words is a sentence of the language. Moreover, the rules of the grammar can generate, by recursive application, all the valid sentences of that language: the language is “recursively enumerable”.
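The idea of a finite rule set recursively generating sentences can be sketched in code. The toy grammar below is purely illustrative (the rules, words, and depth bound are my assumptions, not anything from the text): each rule rewrites a symbol into a sequence of symbols or words, and recursive expansion enumerates the sentences the grammar licenses.

```python
import itertools

# A toy grammar, purely illustrative: each symbol maps to its
# possible expansions (lists of symbols and/or words).
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["dog"], ["cat"]],
    "V":  [["sees"]],
}

def generate(symbol, depth=4):
    """Recursively expand a symbol, yielding word lists up to a depth bound."""
    if symbol not in RULES:            # a terminal word: yield it as-is
        yield [symbol]
        return
    if depth == 0:                     # cut off unbounded recursion
        return
    for expansion in RULES[symbol]:
        # combine every way of expanding each symbol in the rule
        choices = [list(generate(s, depth - 1)) for s in expansion]
        for parts in itertools.product(*choices):
            yield [word for part in parts for word in part]

sentences = {" ".join(words) for words in generate("S")}
```

With these four rules the grammar yields exactly the four sentences pairing “the dog”/“the cat” as subject and object; adding a recursive rule (e.g. an NP containing a prepositional phrase) would make the sentence set unbounded while the rule set stays finite.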
When a sentence is analyzed (or “parsed”), the sequence of rules applied to it builds up a “parse tree”. This type of grammar, the so-called “phrase-structure grammar”, turns out to be equivalent in power to a Turing machine, and therefore lends itself to direct implementation on a computer.
The phrase-structure approach to language is based on “immediate constituent analysis”: the structure of a sentence is defined by its constituents (noun phrase, verb phrase, etc.), each of which can in turn be broken down into smaller constituents.
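Immediate constituent analysis can be sketched as a small parser. This is a minimal, hand-written recursive-descent parser for one sentence pattern (the lexicon and the rules S → NP VP, NP → Det N, VP → V NP are illustrative assumptions); applying the rules builds the nested parse tree the text describes.

```python
# Illustrative lexicon: word -> part of speech (an assumption, not
# taken from the text).
LEXICON = {"the": "Det", "dog": "N", "cat": "N", "sees": "V"}

def parse_np(words, i):
    # NP -> Det N
    if (i + 1 < len(words)
            and LEXICON.get(words[i]) == "Det"
            and LEXICON.get(words[i + 1]) == "N"):
        return ("NP", [("Det", words[i]), ("N", words[i + 1])]), i + 2
    return None, i

def parse_vp(words, i):
    # VP -> V NP
    if i < len(words) and LEXICON.get(words[i]) == "V":
        np, j = parse_np(words, i + 1)
        if np:
            return ("VP", [("V", words[i]), np]), j
    return None, i

def parse_sentence(sentence):
    # S -> NP VP; returns a nested (constituent, children) tree, or
    # None if the string is not a sentence of this toy grammar.
    words = sentence.split()
    np, i = parse_np(words, 0)
    if np:
        vp, j = parse_vp(words, i)
        if vp and j == len(words):
            return ("S", [np, vp])
    return None

tree = parse_sentence("the dog sees the cat")
```

The parser doubles as the grammar's membership test: a string that yields a tree belongs to the language, and one that yields `None` (e.g. "dog the sees") does not.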
Initially, Chomsky thought that a grammar needed a tripartite structure: a sequence of rules to generate phrase structure, a sequence of morpho-phonemic rules to convert strings of morphemes into strings of phonemes, and a sequence of transformational rules to transform strings with phrase structure into new strings to which the morpho-phonemic rules can apply.
Whatever the set of rules, the point was that analyzing language had been turned into a mechanical process of generating more and more formal strings, just as in proving a mathematical theorem. The underlying principle was that all the sentences of the language (which are potentially infinite) could be generated by a finite (and relatively small) number of rules, through the recursive application of those rules. And this fit perfectly with Artificial Intelligence's logic-based approach to simulating the mind.