The LMU Germanic Linguistics Section and the Berkeley Linguistics Department are pleased to announce a Workshop on Language Change for the Worse, funded through LMU-UCB Research in the Humanities. This Workshop is organized by Dankmar Enke, Guido Seiler and Thilo Weber (LMU Munich) and by Larry Hyman and Johanna Nichols (UC Berkeley).
Our goal is to bring together linguists interested in language change, irrespective of their theoretical framework, to present (or refute) instances of change for the worse, including cases of complexification, and to explore some of the implications of the (non-)existence of this class of phenomena for current theories of language change. For this workshop, oral presentations are organized around the following specific themes: (i) Phonological change for the worse, (ii) Morphosyntactic change for the worse, (iii) Semantic change for the worse, (iv) Focus on complexification and (v) Modelling change for the worse.
The workshop consists of invited talks, submitted presentations, and discussion sessions. Invited speakers include:
The idea that, over time, languages improve (or worsen) globally has largely been abandoned. Nonetheless, many theories of language change hold that, at least on a local level, changes are indeed driven by a need for improvement. For instance, Naturalness Theory assumes that structural features of language can be evaluated in terms of their 'naturalness'/'markedness', and it predicts that over time, 'natural'/'unmarked' structures will win out over 'unnatural'/'marked' ones. Another example is Vennemann's (1988, 1993) preference theory, which holds that every syllable structure change leads to an improvement of syllable structure. Haspelmath (1999), finally, combines functionalist/usage-based approaches to language change with an evolutionary perspective, interpreting local changes as functional adaptations to the needs of language users.
Changes for the worse, on the other hand, are often considered a mere by-product of a change for the better in some other area: Vennemann, for example, argues that if a change worsens syllable structure (as is the case, e.g., with syncope and apocope), it is not a genuine syllable-structural change but rather follows from a change on some other parameter (in the case of vowel deletion: preferred word size). The idea that the principles or criteria defining what is 'good'/'better' or 'bad'/'worse' may conflict with each other is also present in Naturalness Theory and, perhaps most prominently, in Optimality Theory (OT). As regards naturalness, what is natural in terms of phonology (e.g. loss of a word-final unstressed vowel) may well be unnatural in terms of morphology (e.g. if, along with the vowel, an entire affix is lost, which may lead to syncretism). As regards OT, grammar is explicitly conceived of as a set of competing, violable and hierarchically ordered constraints: if an output violates a given constraint A, this violation must be justified by the fulfilment of a higher-ranking constraint B. From a diachronic perspective, then, language change is nothing but a re-ranking of constraints. Constraint re-ranking, too, expresses the idea that 'worsening' outputs are a side effect of some other local improvement, i.e. the promotion of some other constraint.
Our workshop aims to explore phenomena of language change that seem to run counter to the hypothesis outlined above: changes for the worse that do not readily follow from an improvement in some other area of the language system. A potential example of a development away from what appears to be functionally useful is the emergence of the well-known verb-second constraint of most modern Germanic languages: even though, from a pragmatic perspective, it often seems desirable to front more than one constituent (e.g. when several constituents convey old information, an option that apparently was exploited in Old High German), modern German requires that the finite verb be preceded by no more than one constituent. The functional gain of this rule remains obscure. An example of a diachronic change toward the typologically marked can be found in the morphological system of the Alemannic (southern German) dialect of Visperterminen: this dialect has retained a rich system of case distinctions in the plural while having drastically reduced case marking in the singular, thus violating Greenberg's implicational universal that morphological distinctions found in the marked value of a given category will also be found in the unmarked value. At the same time, it is hard to find a higher-ranking principle that would seem to justify this violation.
An important aspect of the question of language worsening (and improvement) is the debate on complexification (and, conversely, 'simplification'). This idea has more than dubious roots, namely the 19th-century notion that some (mostly European) languages are more advanced than others. It is small wonder, then, that 20th-century linguistics more or less reached a consensus that human languages are constant and very much alike in their overall degree of complexity. However, this basic tenet has recently been called into question and has received considerable attention ever since (e.g. Dahl 2004; Miestamo et al. 2008; Garrett 2008; Albright 2008; Sampson et al. 2009; Trudgill 2011; Newmeyer & Preston 2014). The leading idea is that complexity is not a global property of grammatical systems but instead stems from the interaction of their different parts. Crucially, an increase in complexity seems to violate the markedness principles postulated by Naturalness or Optimality Theories. Typical questions which have been raised in this context, and which are also immediately relevant to the workshop topic, include the following: How can complexity (and the processes associated with it) be measured? Is complexification in one sub-system (e.g. morphology) always balanced out by simplification in another sub-system (e.g. syntax)? What are the determinants of (de-)complexification?
Copyright © LMU 2017