Usage-Based Linguistics
Throughout the 20th century, structuralist and generative linguists argued that the study of the language system (langue, competence) must be separated from the study of language use (parole, performance). This view has been called into question by usage-based linguists, who argue that the structure and organization of a speaker's linguistic knowledge are the product of language use or performance. On this account, language is seen as a dynamic system of fluid categories and flexible constraints that are constantly restructured and reorganized under the pressure of domain-general cognitive processes that are involved not only in the use of language but also in other cognitive phenomena such as vision and (joint) attention. The general goal of usage-based linguistics is to develop a framework for the analysis of the emergence of linguistic structure and meaning.
In order to understand the dynamics of the language system, usage-based linguists study how languages evolve, both in history and in language acquisition. One aspect that plays an important role in this approach is frequency of occurrence. As frequency strengthens the representation of linguistic elements in memory, it facilitates the activation and processing of words, categories, and constructions, which in turn can have long-lasting effects on the development and organization of the linguistic system. A second aspect that has been very prominent in the usage-based study of grammar concerns the relationship between lexical and structural knowledge. Since abstract representations of linguistic structure are derived from language users' experience with concrete linguistic tokens, grammatical patterns are generally associated with particular lexical expressions.

Keywords: usage-based linguistics, emergence, network, frequency, cognitive processes, social cognition, lexical specificity, linguistic productivity, constructions, schemas
The research program of usage-based linguistics stands in sharp contrast to the structuralist and generative approach. Ever since Saussure, the study of the linguistic system has been separated from the study of language use or performance. In the classic version of generative grammar, language, notably grammar, is primarily seen as a computational system rather than an instrument of communication (Chomsky, 1965). Building on this view, grammar is commonly analyzed in terms of a set of primitive categories and concatenating rules which, according to Chomsky (1986) and other generative scholars, are biologically predetermined by a particular faculty of the mind (Pinker, 1994; Pinker & Jackendoff, 2005).
Usage-based linguists reject the innateness hypothesis of generative grammar and with it the traditional distinction between grammar and usage, or competence and performance. In this approach, language consists of fluid structures and probabilistic constraints that are shaped by communication, memory, and processing. Challenging the widespread assumption that linguistic structure is built from a predefined set of innate linguistic concepts, usage-based linguists conceive of language as a dynamic network in which the various aspects of a language user's linguistic knowledge are constantly restructured and reorganized under the continuous pressure of performance. In order to understand the (synchronic) organization of the linguistic system, usage-based linguists study how languages evolve, both in history and in acquisition.
One aspect that plays an important role in the usage-based analysis of linguistic structure and meaning is frequency of occurrence. As frequency strengthens the representation of linguistic elements in memory, it facilitates the activation and processing of words, categories, and constructions, which in turn can have long-lasting effects on the organization of linguistic knowledge in the language network.
A second aspect that is of central significance to the usage-based study of language concerns the relationship between lexical and grammatical knowledge. Since abstract representations of grammatical structure are derived from language users' experience with particular words and utterances, there is a close connection between lexical and grammatical knowledge in the usage-based model of grammar. In the structuralist approach, linguistic structure is assumed to be independent of particular lexical expressions; but in the usage-based approach, syntactic structures are lexically particular.
The usage-based approach has evolved from earlier research in functional and cognitive linguistics, which emphasized the importance of pragmatic and conceptual factors for the emergence of language structure and meaning (Givón, 1979; Hopper & Thompson, 1980; Talmy, 1983; Langacker, 1987; Lakoff, 1987), but in more recent research the focus of analysis has shifted to the effects of frequency and processing on the development and organization of linguistic knowledge (Arnon & Snider, 2010; Bybee & Hopper, 2001; Bybee, 2006, 2007, 2010; Goldberg, 2006; Hay, 2001; Krug, 2003).
This article provides an overview of some central themes of current research in usage-based linguistics. The article consists of two main parts. Part one is concerned with the network architecture of language, which provides a general framework for the analysis of linguistic knowledge; part two describes some of the interactive and cognitive processes that have been proposed in the linguistic and psycholinguistic literature to explain how linguistic knowledge is shaped by communication and processing.
the copular be, and an adjective that describes the preceding nominal (NP–be–ADJECTIVE).

In the classic version of generative grammar, morphological and syntactic structures are derived from primitive categories and concatenating rules (e.g., NP → DET ADJ N), but there is good evidence that structures such as [VERB–er] and [NP–be–ADJECTIVE] are stored and processed as holistic grammatical patterns that evoke particular semantic representations irrespective of the words they include (see Goldberg, 2006, pp. 6–9, for discussion).
To begin with, Bybee (1985) proposed a network model of morphology in which words, rather than morphemes, are the basic units of analysis (see also Aronoff, 1994). In this model, affixes are represented together with a base, and complex words (or morphological constructions) are structured by lexical connections that indicate overlapping parts between words of the same paradigm or morphological family. Consider, for instance, the graph in Figure 1, which is very similar to network representations in Bybee (1985, 1988, 1995, 2001) and Hay and Baayen (2005).
As can be seen, morphologically complex words with overlapping parts are related by associative connections that mark them as members of a particular morphological class (e.g., the class of regular past tense verbs). In addition, speakers may represent generalizations across groups of connected words in a morphological schema ([re __ ]v, [ __ ed]v); but there is good evidence that (frequent) words are stored together with bound morphemes as prefabricated units (see Bybee, 1985, 1995; Sereno & Jongman, 1999).
One general advantage of this approach is that morphological structure is analyzed within the same general network model as associations between semantically related lexemes (cow–farm), words of suppletive paradigms (go–went), words that alliterate (fry–free–frozen) or rhyme (hat–cat–rat), and phonesthemes (glow–glitter–glisten). All of these phenomena involve associative connections between semantically and/or phonologically related expressions that are evident in psycholinguistic experiments. Note that the strength of lexical connections varies on a continuum, which is easily explained in a dynamic network model by assigning different weights to particular connections.
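To make the idea of graded connection weights concrete, the following Python sketch (a toy illustration added here, not part of the original article) represents a few lexical items as nodes in a weighted graph; the words and weight values are invented for the example.

```python
# Toy illustration of a lexical network with weighted associative connections.
# All words and weights below are invented for the example.

from collections import defaultdict

class LexicalNetwork:
    def __init__(self):
        # adjacency map: word -> {related word: connection weight}
        self.links = defaultdict(dict)

    def connect(self, w1, w2, weight):
        # connections are symmetric; the weight encodes connection strength
        self.links[w1][w2] = weight
        self.links[w2][w1] = weight

    def activate(self, word):
        # return related words ordered by connection strength,
        # a crude stand-in for graded activation spreading
        return sorted(self.links[word].items(), key=lambda kv: -kv[1])

net = LexicalNetwork()
net.connect("play", "played", 0.9)    # morphological (paradigm) link
net.connect("played", "spilled", 0.6) # shared past-tense schema [ __ ed]
net.connect("cow", "farm", 0.5)       # semantic association
net.connect("hat", "cat", 0.3)        # rhyme (phonological) association

print(net.activate("played"))  # [('play', 0.9), ('spilled', 0.6)]
```

Strengthening or weakening individual weights (e.g., through frequency of co-occurrence) is one simple way to model the continuum of connection strengths described above.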
(1)
(2)
(3)
(4)
(5)
Relative clauses are subordinate clauses that modify a noun in the main clause, and this noun serves a particular semanto-syntactic role in the relative clause. Subject and nonsubject relative clauses are distinguished by word order and the optional "omission" of the relative marker in nonsubject relative clauses (cf. 6 and 7).
(6)
(7)
The latter (i.e., nonsubject RCs) comprise object, oblique, and genitive relative clauses, which are differentiated by the use of different pronouns (whom vs. who vs. whose), adpositions (which vs. of which), verb valency (transitive vs. intransitive), and the "omission" of the "relativized noun" (i.e., the semantic referent that is coreferential with the noun being modified by the relative clause). The various types of relative clauses constitute a hierarchical network of constructions ranging from lexicalized structures at the bottom of the network (e.g., All I know, The way I am) to highly abstract representations at the top (Figure 2).
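The taxonomic idea can be pictured with a small data structure. The following Python sketch (an invented toy fragment, not the network in Figure 2) stores lexically specific instances at the bottom of a hierarchy of increasingly schematic constructions; all node labels and example strings are made up for illustration.

```python
# Toy fragment of a hierarchical construction network for relative clauses.
# Node labels and example strings are invented for illustration.

construction_network = {
    "RELATIVE CLAUSE": {
        "SUBJECT RC [N (that) V ...]": {
            "instances": ["the man that called me"],
        },
        "NONSUBJECT RC [N (that) N V ...]": {
            "OBJECT RC": {"instances": ["all (that) I know"]},
            "OBLIQUE RC": {"instances": ["the way (that) I am"]},
        },
    },
}

def lexicalized_instances(node):
    """Collect the concrete, lexically specific exemplars stored at the
    bottom of the hierarchy under a given (schematic) construction."""
    found = []
    for key, value in node.items():
        if key == "instances":
            found.extend(value)
        elif isinstance(value, dict):
            found.extend(lexicalized_instances(value))
    return found

print(lexicalized_instances(construction_network["RELATIVE CLAUSE"]))
```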
Children acquire the hierarchical network of relative clause constructions (and other grammatical patterns) in a piecemeal, bottom-up fashion whereby they "construct" increasingly more schematic representations of relative clause constructions that enable mature language learners to produce novel relative clauses, that is, relative clauses they have never heard or used before (Diessel, 2009).
The hierarchical organization of constructions has been a central topic of usage-based research on language acquisition (Tomasello, 2003; Goldberg, 2006) and language change (Hilpert, 2013; Traugott & Trousdale, 2013), but constructions are not only taxonomically related. There are also associative connections between constructions with overlapping and contrastive features (similar to complex words in morphological paradigms; see Figure 1). Content questions, for instance, share a number of properties with relative clauses, which can be explained by analogical connections between constructions in the grammar network. As can be seen in Figure 3, both clause types begin with a wh-word, they differentiate subject from nonsubject roles by linear order, and they both occur with stranded prepositions.
Like questions and relative clauses, many other syntactic patterns are interconnected. Active and passive sentences, for instance, form a pair of constructions that present a causative event from different perspectives (cf. 8 and 9) (Langacker, 1991, §4), and purpose infinitive clauses share formal and semantic properties with infinitival complement clauses (cf. 10 and 11) (Schmidtke-Bode, 2009, pp. 157–165).
(8)
(9)
(10)
(11)
3. Cognitive Processes
A second principle that is of fundamental significance to usage-based linguistics is that language use and language development are driven by the same cognitive processes as other, nonlinguistic forms of cognition and social behavior. Since many usage-based linguists have stressed the importance of frequency for the emergence of linguistic knowledge, there has been a tendency to associate usage-based linguistics with the analysis of memory-related processes (which are immediately determined by frequency of occurrence), but memory is not the only factor that affects speakers' linguistic behavior. There is general consensus among usage-based linguists that language use involves a wide range of cognitive and social processes, which may be divided into three general domains, namely the domains of (i) social cognition (cf. §3.1), (ii) conceptualization (cf. §3.2), and (iii) memory and processing (cf. §3.3).
3.1 Social Cognition
Language use is a particular form of social interaction, which involves a set of cognitive processes that concern the ability to take other persons' knowledge, intentions, and beliefs into account (Clark, 1996; Tomasello, 2003). This ability, which is often characterized as a uniquely human capacity (Tomasello, 1999), is of central significance to both language use and language development.
A basic form of social cognition is joint attention (Carpenter, Tomasello, & Savage-Rumbaugh, 1998; Tomasello, 1999; Eilan, Hoerl, McCormack, & Roessler, 2005). In order to communicate, the interlocutors must focus their attention on the same experience, which may involve an object or event in the surrounding situation or a concept that is evoked by the preceding discourse. In face-to-face conversation, joint attention is commonly established by nonverbal means of communication such as eye gaze, head movement, and gesture. Of particular importance is deictic pointing—a communicative device that is universally available to establish joint attention and that is commonly accompanied by demonstratives (or spatial deictics) (Bühler, 1934; Diessel, 2006).
Joint attention is a prerequisite for social interaction, but communication involves more than a shared focus of attention. In order to communicate, the interlocutors have to align their knowledge and beliefs, that is, they have to establish a common ground that is available as a background for the interpretation of novel information (Clark & Brennan, 1991; Clark, 1996). Common ground provides the basis for what some psychologists call "audience design," which is the process whereby speakers seek to construct a sentence according to what they think the hearer "needs" in order to understand their communicative intention in a particular situation (Clark & Marshall, 1981; see also Horton & Gerrig, 2005).
To illustrate, all languages have multiple types of referring expressions—definite and indefinite NPs (a/the boy), proper names (John), demonstratives (that one), third person pronouns (he), and zero anaphors (Gundel, Hedberg, & Zacharski, 1993). Functional linguists have shown that the occurrence of the various types of referring expressions correlates with aspects of the linguistic and nonlinguistic context (Givón, 1984; Ariel, 1990; Chafe, 1994); but from a cognitive perspective we may say that speakers choose a particular term based on what they think the listener knows and sees, and listeners interpret the chosen expressions based on the assumption that speakers construct sentences according to this strategy (see Arnold, 2008, for a review). In other words, the choice and interpretation of linguistic expressions is crucially influenced by the interlocutors' assessment of common ground.
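One crude way to picture this kind of audience design is a toy rule that maps an assumed accessibility score onto a referring expression type. The sketch below is an invented illustration, loosely inspired by accessibility and givenness hierarchies (Ariel, 1990; Gundel et al., 1993); the scale, thresholds, and categories are not taken from any particular study.

```python
# Toy model of referential choice: the more accessible a referent is assumed
# to be in the common ground, the more reduced the referring expression.
# The accessibility scale and thresholds are invented for illustration.

def choose_referring_expression(referent, accessibility):
    """Map an assumed accessibility score (0 = brand new, 1 = in focus)
    onto a type of referring expression."""
    if accessibility >= 0.9:
        return "zero anaphor"
    if accessibility >= 0.7:
        return f"pronoun (he/she/it) for {referent}"
    if accessibility >= 0.5:
        return f"demonstrative: that {referent}"
    if accessibility >= 0.3:
        return f"definite NP: the {referent}"
    return f"indefinite NP: a {referent}"

for score in (0.1, 0.4, 0.8, 0.95):
    print(score, "->", choose_referring_expression("boy", score))
```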
What is more, joint attention and common ground are also important for language acquisition and language change. As Tomasello and colleagues have shown, the ability to engage in social interactions evolves only gradually in early childhood (Carpenter et al., 1998; Tomasello, 2003). While infants respond to adults' actions from early on, it is only around the first birthday that they begin to understand the communicative functions of pointing and eye gaze and the role of intentions, which, according to Tomasello, is a prerequisite for language learning. In order to understand a (linguistic) symbol, the child must be able to recognize that language is used for particular purposes and that the (communicative) actions of adults are driven by intentions.
Moreover, there is good evidence that the diachronic development of grammatical markers and constructions is influenced by the communicative pressure to coordinate the interlocutors' attention and knowledge. For instance, the frequent development of demonstratives into grammatical markers can be explained by their communicative function to establish a joint focus of attention (Diessel, 2006). In their basic use, demonstratives refer to objects and events in the surrounding situation, but, as can be seen in (12) and (13), they can also refer to linguistic elements in discourse.
(12)
(13)
Starting from this use, demonstratives are frequently reanalyzed as definite articles, third person pronouns, topic markers, sentence connectives, and a wide range of other grammatical function words (Diessel, 2006), which is arguably motivated by their communicative function to focus the interlocutors' attention on linguistic elements in the unfolding speech stream (see also Bühler, 1934; Diessel, 2012a).
(14)
(15)
(16)
In general, joint attention and common ground are domain-general cognitive phenomena that are foundational to communication and language. They influence language users' linguistic decisions and choices in both speaking and listening and motivate the development of grammatical markers and constructions that serve to enhance discourse coherence through the coordination of (shared) knowledge and attention.
3.2 Conceptualization
Like all other cognitive processes of language use, conceptualization is not specific to language. In fact, the conceptual approach to semantics is inspired by general psychological research on vision. Pioneering research on conceptualization comes from gestalt psychology, which had a strong impact on conceptual semantics (Talmy, 1983; Langacker, 1987). The gestalt psychologists showed that vision involves more than the passive recording of sensory cues—that visual perception is guided by general cognitive principles such as the segregation of figure and ground and reification (which is the enrichment of perceptual information through inference).
Langacker (1991, p. 117) argues that there are always multiple ways of viewing and describing the same experience (see also Croft & Cruse, 2004, §3). The analysis of alternative descriptions provides a useful strategy to illustrate that (linguistic) meaning resides in the cognitive structuring of sensory experience. Consider, for instance, the use of come and go in (17) and (18).
(17)
(18)
Come and go are deictic verbs that can often be used with reference to the same scene, but they describe the scene from different perspectives. In the case of come, the conceptual figure is moving toward the observer; in the case of go, the figure is moving away from the observer (Figure 4).
Both come and go are interpreted relative to a particular point of reference, the deictic
center, which is the origin of a coordinate system that is usually grounded by the
speaker’s location at the time of the utterance; but the deictic center can be shifted from
the speaker to another person, or fictive observer, providing additional evidence for the
view that meaning is constructed by conceptualization (Diessel, 2014).
Like words, constructions involve conceptualization. Consider, for instance, the active–passive alternation in examples (19) and (20).
(19)
(20)
An active sentence construes a scene from the perspective of the agent. In sentence (19), the agent is in the focus of attention and the patient is backgrounded relative to the agent, but in the passive sentence in (20) it is the other way around. In this case, the patient serves as figure and the agent is a secondary focal point (Langacker, 1991, §3), which can be "omitted"; but, of course, conceptually, the passive construction entails an agent or agentive force (Figure 5). Analyzing grammatical relations in this way creates an explicit link between argument structure and general conceptual processes.
To give one more example, in languages with perfective and imperfective aspect, action verbs can be conceptualized in two different ways: as ongoing (imperfective) actions (e.g., He was writing a book) or as completed (perfective) actions (e.g., He has written a book). One feature that distinguishes perfective from imperfective aspect is conceptual boundedness (Langacker, 1987, pp. 86–87). Perfective events are temporally bounded, whereas imperfective events are unbounded (Figure 6). Of course, every event has a beginning and an ending, but perfective verb forms construe an event as temporally bounded, whereas imperfective verb forms present the same event as ongoing and expansible (Talmy, 2000, pp. 50–62).
In general, in the usage-based approach, semantic conventions are emergent from recurrent conceptualizations of the same or similar experiences (or, as Langacker, 1987, p. 99, put it: "semantic structure is conceptualization tailored to the specifics of linguistic convention"). What is more, conceptualization is not only the driving force behind the "construction" of meaning, it also plays an important role in the diachronic development of grammar. In particular, the early stages of grammaticalization are generally motivated by conceptual processes, notably by metaphor and metonymy (Heine, Claudi, & Hünnemeyer, 1991) and the projection of the deictic center (Diessel, 2012a).
The paradigm example of grammaticalization is the English expression be going to, which has developed from a motion verb into a future tense marker, or future tense auxiliary, as evidenced by the fact that be going to (or the contracted form gonna) can be used with a semantically empty, nonmoving subject to indicate future (It's gonna rain). Like English, many other languages have future tense auxiliaries derived from motion verbs, which is, of course, related to the fact that time is commonly conceptualized in terms of space and motion (similar conceptual processes occur in L1 acquisition; see Diessel, 2011b, 2012b).
If deictic expressions of this use could speak, "they would speak as follows: look ahead or back along the band of the present utterance. There something will be found that actually belongs here, where I am, so that it can be connected with what now follows. Or the other way round: what comes after me belongs there, it was only displaced from that position for relief." (Bühler, 1934; English translation from Goodwin, 1990, p. 443)
Some usage-based linguists refer to exemplar theory to explain the role of frequency in language (Bybee, 2006; Abbot-Smith & Tomasello, 2006; Goldberg, 2006). Exemplar theory has been developed by cognitive psychologists as a general cognitive model of categorization and concept learning (Medin & Schaffer, 1978; Nosofsky, 1988). In this approach, concepts are formed from tokens with similar properties that together provide a cognitive reference point for the classification of novel experiences, or novel tokens. As a consequence of experience-based learning, concepts are linked to individual memory traces, and categorization does not always draw on high-level generalizations but often involves knowledge of particular experiences or local clusters of similar tokens (see Murphy, 2002, for discussion).
Exemplar theory has been especially influential in research on phonetics and phonology (cf. Johnson, 1997; Bybee, 2001; Pierrehumbert, 2001), where speech-sound categories such as the (English) vowel phonemes /ɛ/ and /ɔ/ are emergent from the many slightly different phonetic tokens that a language user encounters in experience (Figure 8). When a new phonetic token is encountered, it is categorized according to its similarity to stored tokens (or the entire token cluster).
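As a rough illustration of this kind of exemplar-based categorization, the following Python sketch (a toy example, not the model of any particular study; the formant values are invented) assigns a new token to the category of its most similar stored exemplars.

```python
# Toy exemplar model: stored vowel tokens are points in a two-dimensional
# acoustic space (F1/F2 formant values, invented for illustration), and a
# new token is categorized by its nearest stored exemplars.

import math
from collections import Counter

stored_exemplars = [
    # (F1 in Hz, F2 in Hz, category label)
    (580, 1800, "ɛ"), (600, 1750, "ɛ"), (560, 1850, "ɛ"),
    (570, 900, "ɔ"),  (590, 850, "ɔ"),  (610, 880, "ɔ"),
]

def categorize(f1, f2, k=3):
    """Return the majority category among the k most similar exemplars."""
    by_distance = sorted(
        stored_exemplars,
        key=lambda ex: math.dist((f1, f2), (ex[0], ex[1])),
    )
    labels = [label for _, _, label in by_distance[:k]]
    return Counter(labels).most_common(1)[0][0]

print(categorize(585, 1780))  # -> 'ɛ'
print(categorize(600, 910))   # -> 'ɔ'
```

Because every stored token contributes to the category, adding more tokens of a certain kind shifts the category's center of gravity, which is one simple way to picture frequency effects in an exemplar model.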
Building on this analysis, Bybee (2006) and other usage-based linguists have argued that the exemplar approach can also be applied to morphology and syntax (Goldberg, 2006; Bod, 2009). Specifically, these researchers suggest that grammatical constructions are emergent from language users' experience with strings of lexical tokens and that the cognitive representations of grammatical structure are often associated with particular lexical expressions. On this view, knowledge of grammar includes a great deal of item-specific information (see Diessel, 2016, for discussion).
3.3.2 Automatization
Automatization is a general cognitive mechanism whereby a string of distinct elements is transformed into a processing unit (Logan, 1988; Schneider & Chein, 2003). Langacker (2008, pp. 60–73) uses the notion of "unit" as a technical term for automated sequences that are internally structured but activated and executed as integrated wholes (see also Langacker, 1987, p. 494). Bybee (2010, p. 8) refers to units as chunks and to the process of unit formation as chunking: "Chunking is the process by which sequences of units that are used together cohere to form more complex units" (see also Bybee, 2002). Units, or chunks, are cognitive routines that concern both motor actions such as dancing and cognitive activities such as counting or reciting the alphabet (Langacker, 2008, pp. 16–17; see also Diessel, 2016).
count, as a matter of fact, I was wondering if, to be about to). For many linguistic scholars, idioms constitute a small class of irregular expressions that are listed in the mental lexicon and excluded from grammatical analysis, but in the usage-based approach idiomaticity is seen as a continuum that concerns a wide range of formulaic expressions (Fillmore, Kay, & O'Connor, 1988) shaped by automatization (in conjunction with general conceptual processes such as metaphor and categorization).
What is more, automatization is not only the driving force behind the emergence of formulaic sequences, it is also an important determinant of phonetic reduction and fusion. There is now an extensive body of research indicating that frequent word strings are more likely to undergo phonetic reduction than infrequent word combinations (Bell, Jurafsky, Fosler-Lussier, Girand, Gregory, & Gildea, 2003; Bell, Brenier, Gregory, Girand, & Jurafsky, 2009; Bybee, 1985, 2001; Jurafsky, Bell, Gregory, & Raymond, 2001), which, according to Bybee (2010, pp. 37–34), is primarily caused by automatization or chunking (see also Bybee, 2001, pp. 73–74). Note that the reduction effect of automatization concerns both motor movement, that is, the production of articulatory gestures, and lexical access, that is, the activation of linguistic knowledge. Linguistic expressions that are commonly reduced in speech production may lose their status as independent words and may develop into affixes. There is a well-known developmental path leading from independent words via clitics to bound morphemes (Givón, 1979) that correlates with frequency of occurrence. In this way, automatization is one of the cognitive processes that shape the morphological structure of language (Bybee, 1985, 2001; Krug, 1998).
Crucially, automatization increases the amount of information that can be held in working memory. At any given moment in time, the human processor can focus on only a few items (Cowan, 2005, §3), but since these items often consist of prefabricated chunks that are internally structured and hierarchically organized, it is possible to integrate large amounts of information into the units that are currently activated and processed (Miller, 1956; see also Cowan, 2005, for a review of recent research on this topic). A sentence, for instance, can be seen as a schematic processing unit that consists of a limited number of syntactic chunks—arguments and adjuncts—that are related to lexical chunks—words and collocations—which in turn consist of automated sequences of articulatory gestures. In this view, the hierarchical organization of syntax is to a large extent a consequence of the fact that syntactic structure consists of prefabricated chunks, both lexical chunks (words and collocations) and schematic chunks (syntactic constituents and sentences), that have been shaped by automatization.
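The basic logic of frequency-driven chunking can be illustrated with a toy script (an invented sketch, not a model from the literature): word pairs that co-occur often enough in a small corpus are fused into single processing units.

```python
# Toy illustration of chunking: frequently co-occurring word pairs are
# merged into single units. The mini-corpus and threshold are invented.

from collections import Counter

corpus = ("i am going to tell you what i am going to do "
          "and you know what i mean").split()

# count how often adjacent word pairs occur together
pair_counts = Counter(zip(corpus, corpus[1:]))

THRESHOLD = 2  # pairs used together at least this often become chunks
chunks = {pair for pair, n in pair_counts.items() if n >= THRESHOLD}

def chunk(words):
    """Greedily fuse adjacent words into units when the pair is a chunk."""
    units, i = [], 0
    while i < len(words):
        if i + 1 < len(words) and (words[i], words[i + 1]) in chunks:
            units.append(words[i] + "_" + words[i + 1])
            i += 2
        else:
            units.append(words[i])
            i += 1
    return units

print(chunk("i am going to sleep".split()))
# ['i_am', 'going_to', 'sleep'] -- 'i am' and 'going to' behave as units
```

In a fuller model, the fused units would themselves feed into further chunking and would be candidates for phonetic reduction (e.g., going to > gonna), in line with the frequency effects discussed above.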
3.3.3 Analogy
The notion of analogy is used in many different ways by different scholars. In historical linguistics, analogy is often used as a descriptive term for a certain type of structural change, notably morphological change (Trask, 1996, pp. 105–115), but in usage-based linguistics, analogy is a domain-general cognitive phenomenon that accounts for the productive use of language (Bybee & Moder, 1983; Barðdal, 2008) as well as certain types of language change (Bybee, 2010, §4) and language acquisition (Diessel, 2013).
Since constructional schemas are emergent from language users' experience with concrete words and utterances, they are associated with particular lexical expressions (§3.3.1); but they can be extended to new items by analogy. Two general factors influence the analogical extension of a constructional schema to novel expressions: (i) the activation strength of a schema in memory and (ii) the similarity between lexical expressions that appear in a schema. To illustrate, let us consider the formation of the English past tense, which has been at the center of the debate about analogy and rules (Bybee, 1995; Pinker, 1999).
In the generative approach, the regular past tense is formed by a concatenating rule that combines the suffix -ed with a verb stem (Pinker, 1999); but in the usage-based approach, it is analyzed as a constructional schema that competes with several other, irregular schemas to form the past tense (Bybee, 1995). The irregular past tense schemas are defined by particular phonetic forms that are associated with phonetically similar present tense forms. There are several classes of related irregular present and past tense forms: sing–sang, swim–swam, fly–flew, lend–lent, hit–hit.
Since the regular past tense occurs with a very large number of verb types, it is deeply entrenched in memory and commonly selected to form the past tense of novel verbs, as, for instance, in the case of faxed, emailed, and googled. However, as Bybee and Moder (1983) have demonstrated, if a novel verb is phonetically similar to an irregular verb, speakers may choose an irregular schema to form the past tense. Using a nonce word task, they found that people often produce irregular past tense forms, which they have never heard before, when asked to provide the past tense of the following base forms: spim → [spæm], shink → [ʃʌŋk], spling → [splʌŋ].
Pinker (1999) argued that regular and irregular past tense forms are produced by different cognitive mechanisms. Specifically, he claimed that while irregular past tense forms are created by analogy, regular past tense forms are produced by a concatenating rule. However, challenging Pinker's "dual-mechanism" account, Bybee (1995) argued that the regular past tense constitutes an "open schema" that is automatically activated as a default to form the past tense unless a verb is drawn to an irregular schema because of its phonetic form. On this account, regular and irregular past tense forms are produced by a single cognitive mechanism of pattern matching or analogy. In accordance with this view, cognitive scientists have successfully simulated speakers' choice of regular and irregular past tense forms in connectionist network models that learn to map a given input pattern (i.e., a particular base form) onto a particular output pattern (i.e., a particular past tense form) from processing linguistic data (Rumelhart & McClelland, 1986; Plunkett & Marchman, 1993).
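A single-mechanism account of this kind can be caricatured in a few lines of code. The following sketch is an invented illustration, not an implementation of Bybee's model or of any connectionist simulation: an irregular schema wins only when a nonce verb is sufficiently similar to its members, and the open -ed schema applies otherwise; the verb lists, similarity measure, and threshold are all made up.

```python
# Toy single-mechanism past-tense model: a nonce verb is matched against
# irregular schemas by string similarity; if no schema is similar enough,
# the open regular schema (-ed) applies. All details are invented.

import difflib

irregular_schemas = [
    # (example members, function that forms the past tense)
    (["sing", "swim", "spring", "sling"], lambda v: v.replace("i", "u", 1)),
    (["blow", "grow", "know"],            lambda v: v[:-2] + "ew"),
]

def similarity(verb, members):
    """Crude similarity between the verb and a schema's member verbs."""
    return max(difflib.SequenceMatcher(None, verb, m).ratio() for m in members)

def past_tense(verb, threshold=0.7):
    best = max(irregular_schemas, key=lambda s: similarity(verb, s[0]))
    if similarity(verb, best[0]) >= threshold:
        return best[1](verb)   # analogical extension of an irregular schema
    return verb + "ed"         # default: the open regular schema

print(past_tense("spling"))  # -> 'splung' (drawn to the sing/spring schema)
print(past_tense("fax"))     # -> 'faxed' (no similar irregular schema)
```

In this caricature, raising the threshold or adding more members to a schema changes which outputs are produced, mirroring the two factors named above: schema strength and similarity.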
(21)
(22)
Although fall and dead are exclusively used as intransitive verbs in adult language, it is easy to see why children use them in the transitive construction, in which the subject of the intransitive use serves as patient and object of a causative event. Since fall and dead are semantically similar to causative verbs such as drop and kill, they are readily accommodated to the transitive construction, given that many English verbs of this semantic type occur in both transitive and intransitive constructions (see Diessel, 2013, for further discussion).
Related to this finding, Boas (2008) observed that the productivity of syntactic constructions in adult language is constrained by semantic criteria. Specifically, he argued that the likelihood of a construction being extended to a new verb by analogy is contingent on the semantic relationship between the new verb and the verbs that are routinely used in the construction. The intransitive verb sneeze, for instance, is readily acceptable in the caused-motion construction, as in Goldberg's famous example She sneezed the napkin off the table, because sneeze is semantically similar to blow, which is well established in the caused-motion construction (cf. The wind blew the leaves around the yard). Other intransitive verbs that are semantically more distantly related to verbs of the caused-motion schema are not so easily coerced into this construction.
3.3.4 Priming
Priming is the process by which the activation of information in memory is facilitated through the previous activation of the same or related information. Although priming can occur with all kinds of information, linguistic and nonlinguistic, most research on priming is concerned with language. Two general types of (language) priming can be distinguished: (i) lexical priming and (ii) relational (or structural) priming.
Lexical priming refers to the facilitatory (or inhibitory) effect of a lexical item, the prime, on the activation of a related item, the target. Lexical priming can involve both the meaning and the form of lexical expressions. For instance, people are faster and more accurate in identifying a word such as dog if the word is preceded by a semantically related item such as cat than if it is preceded by an unrelated word such as city. There is also evidence that the phonetic features of a word affect the activation of phonetically related expressions (that rhyme or alliterate with the prime) and that repetition speeds up lexical access and word recognition (Harley, 2001, pp. 145–150).
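A minimal way to picture this facilitation (a toy sketch with invented associations and numbers, not a psycholinguistic model) is to let a prime temporarily raise the activation of associated words, so that related targets need less time to reach a recognition threshold.

```python
# Toy spreading-activation sketch of lexical priming: a prime boosts the
# activation of associated words, so related targets are recognized faster.
# Associations and numeric values are invented for illustration.

associations = {
    "cat": {"dog": 0.6, "mouse": 0.5},
    "doctor": {"nurse": 0.7},
}

resting_activation = 0.2   # baseline activation of every word
threshold = 1.0            # activation needed for recognition

def recognition_time(target, prime=None):
    """Smaller return value = faster recognition (arbitrary units)."""
    activation = resting_activation
    if prime is not None:
        activation += associations.get(prime, {}).get(target, 0.0)
    return threshold - activation

print(recognition_time("dog"))          # 0.8 (no prime)
print(recognition_time("dog", "cat"))   # 0.2 (related prime facilitates)
print(recognition_time("dog", "city"))  # 0.8 (unrelated prime, no boost)
```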
Like lexical priming, relational priming is an implicit memory effect that concerns the activation of knowledge; but relational priming has to do with structure rather than with lexical items. Relational priming has become a central topic of psycholinguistic research on language production and learning (see Pickering & Ferreira, 2008, for a review).
One of the earliest and most influential studies on relational priming is Bock (1986), who showed that people are more likely to describe a ditransitive scene depicting an act of transfer with the to-dative construction (She gave the book to John), rather than the (related) double-object construction (She gave John the book), if they had used the to-dative construction prior to the experimental task. Parallel results were obtained for the active–passive alternation and other related clause types. Interestingly, while most priming experiments involve the same sentence types as prime and target, Bock and Loebell (1990) observed that there are also priming effects between distinct constructions that share some of their structural properties. For instance, in one of their studies they found that active sentences with a locative by-phrase prime passive sentences with an agentive by-phrase and vice versa:
(23)
(24)
Bock and colleagues emphasized that relational priming concerns syntactic structure; but later research showed that relational priming is significantly enhanced if prime and target include the same content words. Pickering and Ferreira (2008) call this the "lexical boost" of relational priming, which was first noticed in a study by Pickering and Branigan (1998). Using a sentence completion task, these researchers found a much stronger priming effect if prime and target included the same verb than if they included different verbs. This finding was replicated in other experimental studies (Pickering & Ferreira, 2008) and confirmed by corpus investigations (Gries, 2005; Szmrecsanyi, 2006). Interestingly, Szmrecsanyi (2006) argues, based on corpus data, that lexical expressions can prime the occurrence of a particular construction even if the prime sentence does not have the same structure as the target. The motion verb go, for instance, primes speakers' choice of the be-going-to future, as opposed to other future tense forms (e.g., will do), although the intransitive verb go and the be-going-to future are embedded in different constructions.
Generalizing across these findings, we may say that priming, both lexical and relational, provides strong evidence for the network architecture of language. Of particular importance is the lexical boost of relational priming, as it suggests that structural patterns are associated with lexical expressions—that constructions are lexically particular (Pickering & Branigan, 1998).
What is more, recent research has argued that priming, notably relational priming, is an important mechanism of language learning (Bock & Griffin, 2000; Kaschak & Borreggine, 2008). Although priming is commonly characterized as a short-term phenomenon, these studies observed that relational priming can have long-lasting effects on (adult) speakers' linguistic behavior, which may be seen as a kind of "implicit learning" (Chang, Dell, & Bock, 2006). In accordance with this view, research with children showed that structural repetition is a conspicuous property of early child language and that young children are extremely sensitive to recurrent structural patterns, especially when these patterns are reinforced by lexical expressions (Savage, Lieven, Theakston, & Tomasello, 2006; Rowland, Chang, Ambridge, Pine, & Lieven, 2012). Taken together, these studies suggest that priming, just like all the other cognitive processes described in this article, has a significant impact on both language use and language development.
4. Conclusion
To conclude, this article has reviewed linguistic and psycholinguistic research on the usage-based model. Challenging longstanding assumptions of linguistic analysis, usage-based scholars conceive of language (and grammar) as a dynamic network of interrelated lexemes and constructions that are in principle always changing under the continuous influence of domain-general processes of language use. Combining research from functional and cognitive linguistics with research from cognitive psychology, the article has given a comprehensive overview of cognitive processes from three general domains, namely the domains of social cognition, conceptualization, and memory and processing, and has explained how the various processes affect linguistic behavior and language development in both L1 acquisition and language change.
Further Reading
Arnon, I., & Snider, N. (2010). More than words: Frequency effects for multi-word phrases. Journal of Memory and Language, 62, 67–87.
Beckner, C., Blythe, R., Bybee, J., Christiansen, M. H., Croft, W., … Schoenemann, T. (2009). Language is a complex adaptive system: Position paper. Language Learning, 59, 1–26.
Bybee, J. (1985). Morphology: A study on the relation between meaning and form. Amsterdam: John Benjamins.
Bybee, J. (1995). Regular morphology and the lexicon. Language and Cognitive Processes, 10, 425–455.
Bybee, J. (2001). Phonology and language use. Cambridge, U.K.: Cambridge University Press.
Bybee, J. (2007). Frequency of use and the organization of language. Cambridge, U.K.: Cambridge University Press.
Bybee, J. (2010). Language, usage, and cognition. Cambridge, U.K.: Cambridge University Press.
Bybee, J., & Hopper, P. (Eds.). (2001). Frequency and the emergence of linguistic structure. Amsterdam: John Benjamins.
Bybee, J., & Beckner, C. (2010). Usage-based theory. In B. Heine & H. Narrog (Eds.), The Oxford handbook of linguistic analysis (pp. 827–856). Oxford: Oxford University Press.
Diessel, H. (2007). Frequency effects in language acquisition, language use, and diachronic change. New Ideas in Psychology, 25, 108–127.
Hilpert, M. (2014). Construction grammar and its application to English. Edinburgh: Edinburgh University Press.
Kemmer, S., & Barlow, M. (Eds.). (2000). Usage-based models of language. Stanford, CA: Center for the Study of Language and Information.
References
Abbot-Smith, K., & Tomasello, M. (2006). Exemplar-learning and schematization in a usage-based account of syntactic acquisition. The Linguistic Review, 23, 275–290.
Arnon, I., & Snider, N. (2010). More than words: Frequency effects for multi-word phrases. Journal of Memory and Language, 62, 67–87.
Austin, J. (1962). How to do things with words. Cambridge, MA: Harvard University Press.
Barðdal, J. (2008). Productivity: Evidence from case and argument structure in Icelandic. Amsterdam: John Benjamins.
Bates, E., & MacWhinney, B. (1989). Functionalism and the Competition Model. In B. MacWhinney & E. Bates (Eds.), The crosslinguistic study of sentence processing (pp. 3–73). Cambridge, U.K.: Cambridge University Press.
Beckner, C., Blythe, R., Bybee, J., Christiansen, M. H., Croft, W., Ellis, N. C., … Schoenemann, T. (2009). Language is a complex adaptive system: Position paper. Language Learning, 59, 1–26.
Bell, A., Jurafsky, D., Fosler-Lussier, E., Girand, C., Gregory, M., & Gildea, D. (2003). Effects of disfluencies, predictability, and utterance position on word form variation in English conversation. Journal of the Acoustical Society of America, 113, 1001–1024.
Bell, A., Brenier, J., Gregory, M., Girand, C., & Jurafsky, D. (2009). Predictability effects on durations of content and function words in conversational English. Journal of Memory and Language, 60, 92–111.
Boas, H. C. (2008). Determining the structure of lexical entries and grammatical constructions in Construction Grammar. Annual Review of Cognitive Linguistics, 6, 113–144.
Bock, K., & Loebell, H. (1990). Framing sentences. Cognition, 35, 1–39.
Bock, K., & Griffin, Z. M. (2000). The persistence of structural priming: Transient activation or implicit learning? Journal of Experimental Psychology: General, 129, 177–192.
Bybee, J. (1995). Regular morphology and the lexicon. Language and Cognitive Processes, 10, 425–455.
Bybee, J. (2001). Phonology and language use. Cambridge, U.K.: Cambridge University Press.
Bybee, J. (2006). From usage to grammar: The mind's response to repetition. Language, 82, 711–733.
Bybee, J. (2007). Frequency of use and the organization of language. Cambridge, U.K.: Cambridge University Press.
Bybee, J. (2010). Language, usage, and cognition. Cambridge, U.K.: Cambridge University Press.
Bybee, J., & Moder, C. L. (1983). Morphological classes as natural categories. Language, 59, 251–270.
Bybee, J., & Scheibman, J. (1999). The effect of usage on degrees of constituency: The reduction of don't in English. Linguistics, 37, 575–596.
Bybee, J., & Hopper, P. (Eds.). (2001). Frequency and the emergence of linguistic structure. Amsterdam: John Benjamins.
Carpenter, M., Tomasello, M., & Savage-Rumbaugh, S. (1998). Joint attention and imitative learning in children, chimpanzees and enculturated chimpanzees. Social Development, 4, 217–237.
Chafe, W. (1994). Discourse, consciousness, and time: The flow and displacement of conscious experience in speaking and writing. Chicago: University of Chicago Press.
Chang, F., Dell, G. S., & Bock, K. (2006). Becoming syntactic. Psychological Review, 113, 234–272.
Chomsky, N. (1986). Knowledge of language: Its nature, origin and use. Westport, CT: Praeger.
Clark, H. H., & Marshall, C. R. (1981). Definite reference and mutual knowledge. In A. K. Joshi, B. L. Webber, & I. S. Sag (Eds.), Elements of discourse understanding (pp. 10–63). New York: Cambridge University Press.
Croft, W., & Cruse, D. A. (2004). Cognitive linguistics. Cambridge, U.K.: Cambridge University Press.
Diessel, H. (2006). Demonstratives, joint attention, and the emergence of grammar. Cognitive Linguistics, 17, 463–489.
Diessel, H. (2007). Frequency effects in language acquisition, language use, and diachronic change. New Ideas in Psychology, 25, 108–127.
Diessel, H. (2009). On the role of frequency and similarity in the acquisition of subject and non-subject relative clauses. In T. Givón & M. Shibatani (Eds.), Syntactic complexity (pp. 251–276). Amsterdam: John Benjamins.
Diessel, H. (2011a). Review article of "Language, usage and cognition" by Joan Bybee. Language, 87, 830–844.
Diessel, H. (2012a). Buehler's two-field theory of pointing and naming and the deictic origins of grammatical morphemes. In T. Breban, L. Brems, K. Davidse, & T. Mortelmans (Eds.), New perspectives on grammaticalization: Theoretical understanding and empirical description (pp. 35–48). Amsterdam: John Benjamins.
Diessel, H. (2012b). Language change and language acquisition. In A. Bergs & L. Brinton (Eds.), Historical linguistics of English: An international handbook (Vol. II, pp. 1599–1613). Berlin: Mouton de Gruyter.
Diessel, H. (2016). "Frequency and lexical specificity": A critical review. In H. Behrens & S. Pfänder (Eds.), Experience counts: Frequency effects in language (pp. 209–237). Berlin: Mouton de Gruyter.
Diessel, H., & Hilpert, M. (2016). Frequency effects in grammar. Oxford Research Encyclopedia of Linguistics.
Eilan, N., Hoerl, C., McCormack, T., & Roessler, J. (Eds.). (2005). Joint attention: Communication and other minds: Issues in philosophy and psychology. Oxford: Oxford University Press.
Erman, B., & Warren, B. (2000). The idiom principle and the open choice principle. Text, 20, 29–62.
Fillmore, C. J., Kay, P., & O'Connor, C. (1988). Regularity and idiomaticity in grammatical constructions: The case of let alone. Language, 64, 501–538.
Gundel, J. K., Hedberg, N., & Zacharski, R. (1993). Cognitive status and the form of referring expressions in discourse. Language, 69, 274–307.
Harley, T. A. (2001). The psychology of language: From data to theory (2nd ed.). Hove, U.K.: Psychology Press.
Hay, J., & Baayen, H. (2005). Shifting paradigms: Gradient structure in morphology. Trends in Cognitive Sciences, 9, 342–348.
Heine, B., Claudi, U., & Hünnemeyer, F. (1991). Grammaticalization: A conceptual framework. Chicago: University of Chicago Press.
Hilpert, M. (2014). Construction grammar and its application to English. Edinburgh: Edinburgh University Press.
Hopper, P., & Thompson, S. A. (1980). Transitivity in grammar and discourse. Language, 56, 251–299.
Horton, W. S., & Gerrig, R. J. (2005). The impact of memory demands on audience design during language production. Cognition, 96, 127–142.
Jonides, J., Lewis, R. L., Nee, D. E., Lustig, C. A., Berman, M. G., & Moore, K. S. (2008). The mind and brain of short-term memory. Annual Review of Psychology, 59, 193–224.
Jurafsky, D., Bell, A., Gregory, M., & Raymond, W. D. (2001). Probabilistic relations between words: Evidence from reduction in lexical production. In J. Bybee & P. Hopper (Eds.), Frequency and the emergence of linguistic structure (pp. 229–253). Amsterdam: John Benjamins.
Kemmer, S., & Barlow, M. (Eds.). (2000). Usage-based models of language. Stanford, CA: Center for the Study of Language and Information.
Lakoff, G., & Johnson, M. (1980). Metaphors we live by. Chicago: University of Chicago Press.
Lakoff, G. (1987). Women, fire, and dangerous things. Chicago: University of Chicago Press.
Langacker, R. W. (1991). Concept, image, and symbol: The cognitive basis of grammar. New York: Mouton de Gruyter.
MacDonald, M. C., Pearlmutter, N. J., & Seidenberg, M. S. (1994). Lexical nature of syntactic ambiguity resolution. Psychological Review, 101, 676–703.
MacDonald, M. C. (2013). How language production shapes language form and comprehension. Frontiers in Psychology, 4, 1–16.
Medin, D. L., & Schaffer, M. M. (1978). Context theory and classification learning. Psychological Review, 85, 207–238.
Miller, G. A. (1956). The magical number seven, plus or minus two: Some limitations on our capacity for processing information. Psychological Review, 63, 81–97.
Pickering, M. J., & Branigan, H. P. (1998). The representation of verbs: Evidence from syntactic priming in language production. Journal of Memory and Language, 39, 633–651.
Pickering, M. J., & Ferreira, V. S. (2008). Structural priming: A critical review. Psychological Bulletin, 134, 427–459.
Pinker, S. (1994). The language instinct: How the mind creates language. New York: Harper.
Pinker, S. (1999). Words and rules: The ingredients of language. New York: Basic Books.
Pinker, S., & Jackendoff, R. (2005). The faculty of language: What's special about it? Cognition, 95, 201–236.
Plunkett, K., & Marchman, V. (1993). From rote-learning to system building: Acquiring verb morphology in children and connectionist nets. Cognition, 48, 21–69.
Rowland, C., Chang, F., Ambridge, B., Pine, J., & Lieven, E. (2012). The development of abstract syntax: Evidence from priming and the lexical boost. Cognition, 125, 49–63.
Rumelhart, D. E., & McClelland, J. L. (Eds.). (1986). Parallel distributed processing: Explorations in the microstructure of cognition (Vol. 2). Cambridge, MA: MIT Press.
Savage, C., Lieven, E. V. M., Theakston, A., & Tomasello, M. (2006). Structural priming as implicit learning in language acquisition: The persistence of lexical and structural priming in 4-year-olds. Language Learning and Development, 2, 27–49.
Schneider, W., & Chein, J. M. (2003). Controlled and automatic processing: From mechanisms to biology. Cognitive Science, 27, 525–559.
Sereno, J. A., & Jongman, A. (1999). Processing of English inflectional morphology. Memory and Cognition, 25, 425–437.
Talmy, L. (1983). How language structures space. In H. L. Pick & L. P. Acredolo (Eds.), Spatial orientation: Theory, research, and application (pp. 225–282). New York: Plenum.
Tomasello, M. (1999). The cultural origins of human cognition. Cambridge, MA: Harvard University Press.
Tomlin, R. S. (1986). Basic word order: Functional principles. London: Croom Helm.
Wray, A. (2002). Formulaic language and the lexicon. Cambridge, U.K.: Cambridge University Press.
Notes:
(1.) Some researchers use the notion of construction also for simple lexemes such as car
or run (Goldberg, 1995; Croft & Cruse, 2004, §4), but in this article, I use the notion of
construction in a more restrictive way for structural patterns that comprise at least two
meaningful elements, and I use the notion of sign as a cover term for both simple lexemes
and constructions (see Diessel, 2015, p. 299).
Holger Diessel