4. How language works

This is about human language structure--my "linguistic levels" of description. (See "Stop!" example in Language video notes.)
These are my notes and topics of discussion in class.

Background on history of linguistics

(implications for relation between language and thought as well as the nature of human language)

Locke, Leibniz, Descartes

W. von Humboldt

(discovery of Indo-European)

More on this later, but it is an amazing story of reconstructing the "extinct" ancestral language from its living descendants.

Piaget, Vygotsky

Boas, Sapir, Whorf

de Saussure, Chomsky

Not surprisingly, these guys flock together according to how much import they give to language, especially the lexicon of a language, in thought. Locke, some of von Humboldt & Vygotsky, Boas, Sapir, and Whorf are on the side of "linguistic relativity," while Leibniz, Descartes, some of von Humboldt, de Saussure, Piaget, and Chomsky tend to emphasize language structures beyond words and symbolic structures beyond language. (Of course there are major differences among the individuals, too. Piaget, for example, never accepted the idea of a "language instinct" but opted for a "cognitive empiricist" view in which language was just a labeling of cognitive structures until the attainment of "formal operations," when it became the basis of cognitive structure itself.)

two "tricks" of language

1. arbitrary signs

2. grammar--infinite use of finite means

(Note the quote from W. von Humboldt!! Recall discussion of Chomsky above.)
(brief aside on language history? Who is who? above)

discrete combinatorial systems

"If a speaker is interrupted at a random point in a sentence, there are on average about ten different words that could be inserted to continue the sentence in a grammatical and meaningful way. 85"
...at least 10^20 (a hundred million trillion sentences)
proof there's no longest sentence
"The number N of different strings of length L formed from a repertoire (an alphabet or vocabulary) of A alternative units is N = A^L"
Note that this is independent of any recursion; it is simply the number of possible signals of a given length built from A elements.
(George Miller discussed much of this in the Human Language-1 video.)
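A quick back-of-the-envelope check of these numbers, as a sketch in Python; the figures of roughly ten possible continuations per word and a 20-word sentence length come from the passage above, nothing else is assumed:

    A = 10        # about ten words could continue a sentence at any point
    L = 20        # a modest sentence length, in words
    N = A ** L    # N = A^L possible strings of that length
    print(N)      # 100000000000000000000, i.e. 10^20, "a hundred million trillion"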

independence from cognition

There are both strings that can be interpreted but are not English, and strings that seem like English but have no plausible interpretation. pp. 87-8

possible models for human language

finite state grammars 91--wrong!

1. probabilities not relevant

Learning sentences is not just learning word orders--what word follows what word.

2. an overriding plan for sentences--"trees"

Chomsky's embedded clauses (p. 94), phrase structure, & recursion (examples pp. 97-99)
S->NP+VP
NP->det+N+(S)
VP->aux+MV+NP
(As some of you know, linguists may represent these "tree" structures in a variety of formal ways. I'm using the most elementary, used by Chomsky (1957) in Syntactic Structures. The more recent X-bar theory attempts to represent the idea of a phrase even more abstractly.)
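To make "infinite use of finite means" concrete, here is a minimal sketch in Python of a generator driven by the three rules above. The tiny lexicon, the "that" printed before an embedded clause, and the 30% chance of taking the optional (S) inside an NP are my own illustrative choices, not part of the rules themselves:

    import random

    LEXICON = {                      # toy lexicon, purely illustrative
        "det": ["the", "a"],
        "N":   ["professor", "student", "computer"],
        "aux": ["will", "may"],
        "MV":  ["see", "want", "believe"],
    }

    def NP():
        # NP -> det + N + (S): the optional S is what makes the grammar recursive
        phrase = [random.choice(LEXICON["det"]), random.choice(LEXICON["N"])]
        if random.random() < 0.3:        # sometimes embed a whole clause inside the NP
            phrase += ["that"] + S()     # "that" added only for readability
        return phrase

    def VP():
        # VP -> aux + MV + NP
        return [random.choice(LEXICON["aux"]), random.choice(LEXICON["MV"])] + NP()

    def S():
        # S -> NP + VP
        return NP() + VP()

    for _ in range(3):
        print(" ".join(S()))

Because an S can appear inside an NP, and every S contains NPs, the same three rules generate sentences of unbounded length from a finite vocabulary.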

3 types of embedded sentences

(For more examples and discussion than we need at the moment, see the notes on recursive sentences and complex sentences.)
All complex sentences have more than one clause (S).

Conjunctions

S1 or/and/because... S2

Relative clauses

An NP can have a modifying clause inside itself (NP->det+N+(S)).
I want the computer [[pro] weighs 1 pound].
I want the computer [0] Steve has [pro].

Complement clauses

A "complement" completes a phrase or clause.
I believe that [S]
He was aware of the fact that the world is round.
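Complement clauses also give the easiest route back to the earlier "no longest sentence" point: whatever candidate longest sentence someone offers, embedding it as a complement yields a longer one. A tiny sketch in Python (the particular sentence and embedding frame are just illustrations):

    sentence = "the world is round"
    for _ in range(3):
        # any sentence S can serve as the complement in "he believes that S",
        # so no sentence can be the longest one
        sentence = "he believes that " + sentence
        print(sentence)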

An ambiguous complex sentence

The fact that Otto knew was surprising.

linguistic levels of description

ambiguous strings of words

An utterance is ambiguous when there is more than one possible structural description (SD) that the grammar can "give" to that utterance.
For example: He slept on the bank.
The professor ordered the Dean to stop drinking.
He likes old men and women.
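One way to see that the ambiguity is structural rather than lexical: the same string of words can receive two different bracketings, i.e., two structural descriptions. A sketch of the last example; the two bracketings are the two standard readings:

    sentence = "He likes old men and women"        # one word string
    reading1 = "He likes [old [men and women]]"    # SD 1: "old" modifies both nouns
    reading2 = "He likes [[old men] and women]"    # SD 2: "old" modifies only "men"

    for s in (sentence, reading1, reading2):
        print(s)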

Other levels of description

Use "Stop!" as an example.

Acoustic
Phonological

Phonetics
Phoneme segments
Syllables

Morphology or lexical
Syntactic
Semantic
Pragmatic and referential

Chomsky's X-bar theory of Universal Grammar (UG)

The idea is to formulate a theory of G that works equally well for any human L; hence, a kind of definitive statement of what a possible human L is. (See Goals above.) In practice there are two main considerations: first, abstract general principles from specific language descriptions; at the same time, keep these UG principles "simple."
The X indicates a variable, where X can be V or N (and maybe prepositions and adjectives), in any language.
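As a rough sketch (the notation varies from author to author), the X-bar schema collapses category-specific rules like those above into one general pattern:
XP->(specifier)+X'
X'->X+(complements)
where X ranges over the lexical categories (N, V, and perhaps P and A), so NP, VP, etc. are all instances of the same schema.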

parts of speech: nouns and verbs

relation between these & concepts

"There is a connection...but it is subtle and abstract..." 106
"nouns are often the names for things and verbs for actions but the human mind can construe things in various ways..."

phrase anatomy: the head and its "roles" 108-9

the phrase is about the "head" (x-bar)
combinatorial semantics

A working assumption is that elemental components of lexical meaning are syntactically organized and fused into an overall phrase meaning.

heads, arguments, adjuncts
verb and noun arguments ("role-player")
adjuncts (modifiers)
subjects

Languages differ in ordering of components of phrases

"Trees become mobiles." 111

parameters

"The piece of information that makes one language different from another is called a parameter. 111"
Learning a language is in part learning the parameters; the principles governing the structures are presumably innate (p. 112).
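A toy illustration in Python of one much-discussed parameter, head direction (whether the head of a phrase precedes or follows its complement). English as head-initial and Japanese as head-final are standard textbook examples; the little function and mini-lexicon are just a sketch:

    def build_phrase(head, complement, head_initial=True):
        # the "parameter": does the head precede or follow its complement?
        return [head, complement] if head_initial else [complement, head]

    # English (head-initial): the verb precedes its object
    print(" ".join(build_phrase("read", "the book", head_initial=True)))   # read the book

    # Japanese (head-final): the verb follows its object
    print(" ".join(build_phrase("yonda", "hon o", head_initial=False)))    # hon o yonda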

pronouns, anaphors, and coreference ("Binding")

The term "binding" refers to the conditions interrelating NPs in a sentence and their referents (and sometimes beyond, into texts, discourse, etc.--though typically this is not seen as a grammatical matter). This is part of the fundamental referential apparatus of human language.
Two NPs co-refer (are co-referents) if they both refer to the same entity.
Anaphors are NPs that somehow pick up their reference from something else in the sentence.

examples

Bill cut him.
Bill cut himself.
Bill believes that Otto cut him.

Empty categories and traces

Not all structures are pronounced or visible. They must be inferred by listeners from other evidence. Languages may differ in this respect ("parameter").

NP and wh-traces

PRO

I want [pro] to visit him.

(Minimalist notes (Chomsky, 1992))

(skip this unless you are interested in Chomsky's own words on the topics of ch.4. Note particularly the last paragraph.)
"some basic properties of language are unusual among biological systems, notably the property of discrete infinity. p.2"
"The language is embedded in performance systems that enable its expressions to be used for articulating, interpreting, referring, inquiring, reflecting and other actions. We can think of the SD as a complex of instructions for these performance systems, providing information relevant to their functions....The performance systems fall into two general types: articulatory-perceptual and conceptual-intentional....Two of the linguistic levels, then, are the interface levels A-P and C-I, providing the instructions for the articulatory-perceptual and conceptual-intentional systems, respectively."
"Another standard assumption is that a language consists of two components: a lexicon and a computational system. The lexicon specifies the items that enter into the computational system, with their idiosyncratic properties. The computational system uses these elements to generate derivations and SDs."
"UG is concerned with the invariant principles of S0 (the initial state) and the range of possible variation. Variation must be determined by what is "visible" to the child acquiring language (the primary linguistic data).... Constructions such as verb phrase, relative clause, passive, etc., remain only as taxonomic artifacts, collections of phenomena explained through the interaction of the principles of UG, with the values of parameters fixed. p.5"
(JL) The above implies that phrase structure itself is derived and that English rules like VP->aux+V+(NP) are not necessary parts of linguistic knowledge but instead follow from UG, the English lexicon, and specific English "parameters." For example, generally, if X is the head of X' (e.g., N is the head of NP), then for English X precedes its complements.