Semantics (from Greek sēmantiká, neuter plural of sēmantikós) is the study of meaning. It focuses on the relation between signifiers, such as words, phrases, signs, and symbols, and what they stand for, their denotata.
Linguistic semantics is the study of meaning that is used by humans to express themselves through language. Other forms of semantics include the semantics of programming languages, formal logics, and semiotics.
The word "semantics" itself denotes a range of ideas, from the popular to the highly technical. It is often used in ordinary language to denote a problem of understanding that comes down to word selection or connotation. This problem of understanding has been the subject of many formal inquiries, over a long period of time, most notably in the field of formal semantics. In linguistics, it is the study of interpretation of signs or symbols as used by agents or communities within particular circumstances and contexts.
The formal study of semantics intersects with many other fields of inquiry, including lexicology, syntax, pragmatics, etymology and others, although semantics is a well-defined field in its own right that often synthesizes insights from these neighbouring fields. In philosophy of language, semantics and reference are related fields. Further related fields include philology, communication, and semiotics. The formal study of semantics is therefore complex.
Semantics contrasts with syntax, the study of the combinatorics of units of a language (without reference to their meaning), and pragmatics, the study of the relationships between the symbols of a language, their meaning, and the users of the language.
In international scientific vocabulary, semantics is also called semasiology.
Linguistics
In linguistics, semantics is the subfield devoted to the study of meaning, as inherent at the levels of words, phrases, sentences, and larger units of discourse (referred to as texts).
The basic area of study is the meaning of signs, and the study of relations between different linguistic units: homonymy, synonymy, antonymy, polysemy, paronyms, hypernymy, hyponymy, meronymy, metonymy, holonymy, and linguistic compounds. A key concern is how meaning attaches to larger chunks of text, possibly as a result of the composition of smaller units of meaning.
Traditionally, semantics has included the study of sense and denotative reference, truth conditions, argument structure, thematic roles, discourse analysis, and the linkage of all of these to syntax.
Montague grammar
In the late 1960s, Richard Montague proposed a system for defining semantic entries in the lexicon in terms of the lambda calculus. In these terms, the syntactic parse of the sentence John ate every bagel would consist of a subject (John) and a predicate (ate every bagel); Montague showed that the meaning of the sentence as a whole could be decomposed into the meanings of its parts and relatively few rules of combination. The logical predicate thus obtained would be elaborated further, e.g. using truth-theoretic models, which ultimately relate meanings to a set of Tarskian universals, which may lie outside the logic. The notion of such meaning atoms or primitives is basic to the language of thought hypothesis from the 1970s.
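The compositional idea can be sketched in a few lines of Python, with higher-order functions standing in for lambda-calculus denotations; the toy model, lexicon, and scoping shortcut below are invented for illustration and are not Montague's own fragment:

    # Toy model: two bagels, both eaten by John.
    entities = {"john", "bagel1", "bagel2"}
    ate_rel = {("john", "bagel1"), ("john", "bagel2")}

    # Lexical entries as denotations.
    john = "john"                                  # an individual (type e)
    bagel = lambda x: x.startswith("bagel")        # a predicate (type e -> t)
    ate = lambda y: lambda x: (x, y) in ate_rel    # transitive verb (e -> e -> t)

    # "every" as a generalized quantifier: (e -> t) -> (e -> t) -> t.
    every = lambda p: lambda q: all(q(x) for x in entities if p(x))

    # "John ate every bagel", with the object quantifier scoping over
    # the verb: every(bagel)(λy. ate(y)(john)).
    print(every(bagel)(lambda y: ate(y)(john)))    # True in this model

In Montague's actual fragment the quantifier in object position requires a type-shifted verb meaning; the direct scoping above is a simplification.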
Despite its elegance, Montague grammar was limited by the context-dependent variability in word sense, which led to several attempts at incorporating context, such as:
situation semantics (1980s): truth-values are incomplete and get assigned based on context
generative lexicon (1990s): categories (types) are incomplete and get assigned based on context
Dynamic turn in semantics
In Chomskyan linguistics there was no mechanism for the learning of semantic relations, and the nativist view considered all semantic notions as inborn. Thus, even novel concepts were proposed to have been dormant in some sense. This view was also thought unable to address many issues, such as metaphor or associative meanings, semantic change (in which meanings within a linguistic community change over time), and qualia or subjective experience. Another issue not addressed by the nativist model was how perceptual cues are combined in thought, e.g. in mental rotation.
This view of semantics, as an innate finite meaning inherent in a lexical unit that can be composed to generate meanings for larger chunks of discourse, is now being fiercely debated in the emerging domain of cognitive linguistics and also in the non-Fodorian camp in the philosophy of language.
The challenge is motivated by:
factors internal to language, such as the problem of resolving indexicals or anaphora (e.g. this x, him, last week). In these situations "context" serves as the input, but the interpreted utterance also modifies the context, so it is also the output. Thus, the interpretation is necessarily dynamic and the meaning of sentences is viewed as context change potentials instead of propositions (a minimal sketch of this dynamic view follows this list).
factors external to language, i.e. language is not a set of labels stuck on things, but "a toolbox, the importance of whose elements lie in the way they function rather than their attachments to things". For example, the colours implied in phrases such as "red wine" (very dark), "red hair" (coppery), "red soil", or "red skin" are very different; indeed, these colours by themselves would not be called "red" by native speakers. These instances are contrastive: "red wine" is so called only in comparison with the other kind of wine (which likewise is not "white" for the same reasons). This view goes back to de Saussure and may go back to earlier Indian views on language, especially the Nyaya view of words as indicators and not carriers of meaning.
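The context-change-potential idea can be sketched as follows: each sentence denotes a function from an input context to an updated context rather than a proposition. The two-sentence discourse, the referent naming, and the naive pronoun-resolution rule are all invented for illustration:

    def a_man_walked_in(context):
        """'A man walked in': introduce a new discourse referent."""
        new = dict(context)
        new["x1"] = {"man", "walked in"}
        return new

    def he_sat_down(context):
        """'He sat down': resolve the pronoun against the context
        (here, naively, the most recently introduced referent)."""
        new = dict(context)
        referent = list(new)[-1]      # context is the input...
        new[referent] = new[referent] | {"sat down"}
        return new                    # ...and the updated context is the output

    context = {}
    for sentence in (a_man_walked_in, he_sat_down):
        context = sentence(context)
    print(context)  # {'x1': {'man', 'walked in', 'sat down'}}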
An attempt to defend a system based on propositional meaning for semantic underspecification can be found in the Generative Lexicon model of James Pustejovsky, who extends contextual operations (based on type shifting) into the lexicon. Thus meanings are generated on the fly based on finite context.
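A sketch of such a contextual operation, assuming a toy lexical entry with 'telic' and 'agentive' slots in the spirit of Pustejovsky's qualia structure (the encoding is illustrative, not his formalism): the object-denoting noun book is coerced to an event its entry supplies, so "began the book" is read as "began reading the book".

    lexicon = {
        "book": {"type": "object", "telic": "reading", "agentive": "writing"},
    }

    def coerce_to_event(entry, role="telic"):
        """Shift an object-denoting entry to an event from its qualia."""
        if entry["type"] == "object":
            return entry[role]
        return entry

    print("began " + coerce_to_event(lexicon["book"]) + " the book")
    # began reading the book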
Prototype theory
Another set of concepts related to fuzziness in semantics is based on prototypes. The work of Eleanor Rosch in the 1970s led to a view that natural categories are not characterizable in terms of necessary and sufficient conditions, but are graded (fuzzy at their boundaries) and inconsistent as to the status of their constituent members.
Systems of categories are not objectively "out there" in the world but are rooted in people's experience. These categories evolve as learned concepts of the world – meaning is not an objective truth, but a subjective construct, learned from experience, and language arises out of the "grounding of our conceptual systems in shared embodiment and bodily experience".
A corollary of this is that the conceptual categories (i.e. the lexicon) will not be identical for different cultures, or indeed, for every individual in the same culture. This leads to another debate (see the Sapir–Whorf hypothesis or Eskimo words for snow).
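The graded view can be sketched as a similarity computation: a category is represented by a prototype (a set of typical features) and membership is a score rather than a yes/no test. The BIRD prototype and its features below are invented for illustration:

    def membership(item_features, prototype):
        """Graded membership in [0, 1]: overlap with the prototype."""
        return len(item_features & prototype) / len(prototype)

    bird_prototype = {"flies", "feathers", "sings", "small"}
    robin = {"flies", "feathers", "sings", "small"}
    penguin = {"feathers", "swims", "large"}

    print(membership(robin, bird_prototype))    # 1.0  (a central member)
    print(membership(penguin, bird_prototype))  # 0.25 (a peripheral member)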
Theories in semantics
Model-theoretic semantics
Originates from Montague's work (see above). A highly formalized theory of natural language semantics in which expressions are assigned denotations (meanings) such as individuals, truth values, or functions from one of these to another. The truth of a sentence, and more interestingly, its logical relation to other sentences, is then evaluated relative to a model.
Formal (or truth-conditional) semantics
Pioneered by the philosopher Donald Davidson, another formalized theory, which aims to associate each natural language sentence with a meta-language description of the conditions under which it is true, for example: "Snow is white" is true if and only if snow is white. The challenge is to arrive at the truth conditions for any sentence from fixed meanings assigned to the individual words and fixed rules for how to combine them. In practice, truth-conditional semantics is similar to model-theoretic semantics; conceptually, however, they differ in that truth-conditional semantics seeks to connect language with statements about the real world (in the form of meta-language statements), rather than with abstract models.
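The programme can be sketched by pairing each sentence with a truth condition, stated here as a Python predicate over a toy "world" standing in for a meta-language statement; the lexicon, the single combination rule, and the world are illustrative assumptions:

    world = {"snow": {"white"}, "grass": {"green"}}

    # One fixed combination rule ("X is Y") plus fixed word meanings
    # yield truth conditions for arbitrarily many sentences.
    def predication(subject, adjective):
        return lambda w: adjective in w.get(subject, set())

    # "'Snow is white' is true if and only if snow is white (in w)."
    snow_is_white = predication("snow", "white")
    grass_is_white = predication("grass", "white")

    print(snow_is_white(world))   # True
    print(grass_is_white(world))  # False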
Lexical & conceptual semantics
This theory is an effort to explain properties of argument structure. Its assumption is that the syntactic properties of phrases reflect the meanings of the words that head them. With this theory, linguists can better deal with the fact that subtle differences in word meaning correlate with other differences in the syntactic structures that the word appears in. The small parts that make up the internal structure of words are referred to as semantic primitives. A distinction is therefore made between degrees of participation as well as modes of participation.
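A sketch of such a decomposition, using the familiar textbook analysis of "x killed y" as CAUSE(x, BECOME(NOT(ALIVE(y)))); the tuple encoding is an illustrative choice, not a standard formalism:

    def NOT(p):      return ("NOT", p)
    def ALIVE(y):    return ("ALIVE", y)
    def BECOME(p):   return ("BECOME", p)
    def CAUSE(x, p): return ("CAUSE", x, p)

    def kill(x, y):
        """The internal structure of 'kill', built from primitives."""
        return CAUSE(x, BECOME(NOT(ALIVE(y))))

    print(kill("brutus", "caesar"))
    # ('CAUSE', 'brutus', ('BECOME', ('NOT', ('ALIVE', 'caesar'))))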
Computer science
In computer science, the term semantics refers to the meaning of languages, as opposed to their form (syntax). Additionally, the term semantic is applied to certain types of data structures specifically designed and used for representing information content.
Programming languages
The semantics of programming languages and other languages is an important issue and area of study in computer science. Like the syntax of a language, its semantics can be defined exactly.
For instance, the following statements use different syntaxes, but cause the same instructions to be executed:
    x += y                 (C, Java, Perl, Python, Ruby, PHP, etc.)
    x := x + y             (Pascal)
    ADD x, y               (Intel 8086 assembly language)
    LET X = X + Y          (early BASIC)
    x = x + y              (most BASIC dialects, Fortran)
    ADD Y TO X GIVING X    (COBOL)
    (incf x y)             (Common Lisp)
Generally these operations would all perform an arithmetical addition of 'y' to 'x' and store the result in a variable called 'x'.
Various ways have been developed to describe the semantics of programming languages formally, building on mathematical logic:
Operational semantics: The meaning of a construct is specified by the computation it induces when it is executed on a machine. In particular, it is of interest how the effect of a computation is produced.
Denotational semantics: Meanings are modelled by mathematical objects that represent the effect of executing the constructs. Thus only the effect is of interest, not how it is obtained.
Axiomatic semantics: Specific properties of the effect of executing the constructs are expressed as assertions. Thus there may be aspects of the executions that are ignored.
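The first two styles can be contrasted on a toy language of numerals and addition, encoded as plain ints and nested ("add", t1, t2) tuples; the language, its encoding, and both interpreters below are invented for illustration:

    def step(term):
        """Operational: one small-step reduction (how the effect is produced)."""
        if isinstance(term, int):
            return None                      # values do not step
        op, left, right = term               # op is always "add" here
        if isinstance(left, int) and isinstance(right, int):
            return left + right
        if not isinstance(left, int):
            return (op, step(left), right)   # reduce the left operand first
        return (op, left, step(right))

    def denote(term):
        """Denotational: map each construct to a mathematical object
        (a number), with no notion of execution steps."""
        if isinstance(term, int):
            return term
        _, left, right = term
        return denote(left) + denote(right)

    term = ("add", ("add", 1, 2), 3)
    trace = [term]
    while not isinstance(trace[-1], int):
        trace.append(step(trace[-1]))
    print(trace)          # [('add', ('add', 1, 2), 3), ('add', 3, 3), 6]
    print(denote(term))   # 6: only the effect, not how it is obtained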
Semantic models
Terms such as "
semantic network" and "
semantic data model" are used to describe particular types of data models characterized by the use of
directed graphs in which the vertices denote concepts or entities in the world, and the arcs denote relationships between them.
The Semantic Web refers to the extension of the World Wide Web through the embedding of additional semantic metadata, using semantic data modelling techniques such as RDF and OWL.
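A semantic network of this kind can be sketched as a set of subject-predicate-object triples, loosely in the style of RDF statements; the facts and the query helpers below are invented for illustration:

    triples = {
        ("canary", "is_a", "bird"),
        ("bird", "is_a", "animal"),
        ("bird", "has_part", "wings"),
        ("canary", "has_property", "yellow"),
    }

    def objects(subject, predicate):
        """All vertices reachable from `subject` along one `predicate` arc."""
        return {o for s, p, o in triples if s == subject and p == predicate}

    def is_a_closure(subject):
        """Follow 'is_a' arcs transitively: a simple inheritance query."""
        seen, frontier = set(), {subject}
        while frontier:
            node = frontier.pop()
            for parent in objects(node, "is_a"):
                if parent not in seen:
                    seen.add(parent)
                    frontier.add(parent)
        return seen

    print(is_a_closure("canary"))  # {'bird', 'animal'}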
Psychology
In psychology, semantic memory is memory for meaning – in other words, the aspect of memory that preserves only the gist, the general significance, of remembered experience – while episodic memory is memory for the ephemeral details – the individual features, or the unique particulars, of experience. Word meanings are measured by the company they keep, i.e. the relationships among words themselves in a semantic network. Memories may be transferred intergenerationally or isolated in a single generation due to a cultural disruption. Different generations may have different experiences at similar points in their own timelines. This may then create a vertically heterogeneous semantic net for certain words in an otherwise homogeneous culture. In a network created by people analyzing their understanding of the word (such as WordNet) the links and decomposition structures of the network are few in number and kind, and include "part of", "kind of", and similar links. In automated ontologies the links are computed vectors without explicit meaning. Various automated technologies are being developed to compute the meaning of words: latent semantic indexing and support vector machines, as well as natural language processing, neural networks, and predicate calculus techniques.
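The "company they keep" idea can be sketched with co-occurrence vectors compared by cosine similarity; the three-sentence corpus is invented, and real systems apply latent semantic indexing or trained embeddings to far larger corpora:

    import math
    from collections import Counter

    corpus = [
        "the cat chased the mouse",
        "the dog chased the cat",
        "the mouse ate the cheese",
    ]

    def vector(word):
        """Count the words co-occurring with `word` in the same sentence."""
        counts = Counter()
        for sentence in corpus:
            tokens = sentence.split()
            if word in tokens:
                counts.update(t for t in tokens if t != word)
        return counts

    def cosine(u, v):
        dot = sum(u[k] * v[k] for k in set(u) | set(v))
        norm = math.sqrt(sum(x * x for x in u.values())) \
             * math.sqrt(sum(x * x for x in v.values()))
        return dot / norm if norm else 0.0

    print(cosine(vector("cat"), vector("dog")))     # higher: similar company
    print(cosine(vector("cat"), vector("cheese")))  # lower: different company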
See also
Linguistics and semiotics
Logic and mathematics
Computer science
External links
semanticsarchive.net
Teaching page for A-level semantics
Chomsky, Noam; On Referring, Harvard University, 30 October 2007 (video)
Jackendoff, Ray; Conceptual Semantics, Harvard University, 13 November 2007 (video)
Semantic Systems Biology
Semantics: an interview with Jerry Fodor, ReVEL, vol. 5, n. 8, 2007