Logic in Linguistics


Introduction

The relationship between logic and linguistics is profound and multifaceted. Formal logic provides tools for analyzing the structure and meaning of natural language, while natural language phenomena challenge and extend formal logical systems.

From formal semantics to computational linguistics, logical methods illuminate how language conveys meaning, how sentences combine to form complex thoughts, and how we can build computational systems that understand language.

This guide explores the application of logic to linguistic analysis, from truth-conditional semantics to natural language processing, showing how logical formalism helps us understand the systematic nature of human language.

Formal Semantics

Formal semantics uses logic and mathematics to model how linguistic expressions get their meanings. The goal is to provide precise, compositional accounts of meaning that explain how sentence meaning arises from word meaning and syntactic structure.

Different semantic frameworks make different assumptions about the nature of meaning, but all rely fundamentally on logical tools to make meaning relationships explicit and testable.

Truth-Conditional Semantics

The meaning of a sentence is identified with its truth conditions—the conditions under which it would be true. 'Snow is white' means that snow is white. Logical semantics provides a systematic way to compute truth conditions.

Compositionality (Frege's Principle)

The meaning of a complex expression is determined by the meanings of its parts and how they're combined. This principle enables finite linguistic knowledge to produce infinite sentences—a core property of human language.

Model-Theoretic Semantics

Meanings are defined relative to models—mathematical structures specifying what exists and what properties objects have. A sentence is true in a model if the model satisfies its truth conditions.
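A toy model makes this concrete. The sketch below is illustrative only (the model contents and function names are invented): a model assigns each predicate an extension over a small domain, and a simple predication is true when the entity falls in that extension.

```python
# A toy model: a domain of entities plus extensions for predicates.
# All names and facts here are invented for illustration.
model = {
    "domain": {"snow", "coal", "grass"},
    "white": {"snow"},            # extension of the predicate 'white'
    "green": {"grass"},           # extension of the predicate 'green'
}

def is_true(predicate, entity, model):
    """'entity is predicate' is true in the model iff the entity
    belongs to the predicate's extension."""
    return entity in model[predicate]

print(is_true("white", "snow", model))   # True
print(is_true("white", "coal", model))   # False
```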

Possible Worlds Semantics

Extends model-theoretic semantics to handle modals, conditionals, and intensional contexts. The meaning of 'It might rain' involves quantifying over possible worlds where it rains.
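The quantification over worlds can be sketched directly. In this illustrative fragment (the worlds and the `rains` property are invented), 'might' is an existential and 'must' a universal quantifier over accessible worlds.

```python
# Toy possible worlds: each world records whether it rains there.
worlds = [
    {"name": "w1", "rains": True},
    {"name": "w2", "rains": False},
    {"name": "w3", "rains": True},
]

def might(prop, worlds):
    """'might P': P holds in at least one accessible world."""
    return any(prop(w) for w in worlds)

def must(prop, worlds):
    """'must P': P holds in every accessible world."""
    return all(prop(w) for w in worlds)

rains = lambda w: w["rains"]
print(might(rains, worlds))  # True: it rains in some world
print(must(rains, worlds))   # False: not in every world
```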

Situation Semantics

Instead of evaluating sentences relative to entire worlds, uses partial situations—parts of reality. Addresses problems with possible worlds semantics for certain linguistic phenomena.

Dynamic Semantics

Treats meaning as context-change potential rather than truth conditions. The meaning of 'A man walks in. He sits down' involves how 'a man' introduces a discourse referent accessible to 'he'.

Quantification in Natural Language

Natural language has rich quantificational structure that extends beyond simple ∀ and ∃. Generalized quantifier theory provides logical tools to analyze this complexity.

Universal Quantifiers

Words like 'all', 'every', 'each' express universal quantification but with subtle differences in meaning and syntactic distribution. 'Every student passed' ≈ ∀x(student(x) → passed(x)).

Existential Quantifiers

'Some', 'a', 'several' express existential quantification. 'A student passed' ≈ ∃x(student(x) ∧ passed(x)). Note that 'some' carries a scalar implicature ('not all').
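Both translations can be checked over toy extensions (the student and passing sets below are invented for illustration):

```python
# Invented toy extensions: who is a student, who passed.
students = {"ann", "bob", "cem"}
passed = {"ann", "bob"}

# 'Every student passed' ~ forall x (student(x) -> passed(x))
every_student_passed = all(x in passed for x in students)   # False: cem didn't

# 'A student passed' ~ exists x (student(x) & passed(x))
a_student_passed = any(x in passed for x in students)       # True: ann did

print(every_student_passed, a_student_passed)  # False True
```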

Generalized Quantifiers

'Most', 'few', 'many', 'several' don't reduce to ∀ or ∃. Generalized quantifier theory treats them as relations between sets: 'Most students passed' means |students ∩ passed| > |students ∩ ¬passed|.
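The relation-between-sets view translates directly into set operations. A sketch with invented toy sets:

```python
def most(restrictor, scope):
    """'Most As are B': |A ∩ B| > |A − B|, a relation between sets."""
    return len(restrictor & scope) > len(restrictor - scope)

# Invented toy facts: four students, three of whom passed.
students = {"ann", "bob", "cem", "dee"}
passed = {"ann", "bob", "cem"}

print(most(students, passed))      # True: 3 passed, 1 did not
print(most(students, {"ann"}))     # False: 1 passed, 3 did not
```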

Quantifier Scope Ambiguity

'Everyone loves someone' is ambiguous: ∀x∃y(loves(x,y)) ('everyone has some beloved') vs ∃y∀x(loves(x,y)) ('there's someone everyone loves'). Scope determines logical structure.
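The two readings correspond to different quantifier nestings, which a brute-force check over toy facts makes visible (the `loves` facts below are invented so that the readings come apart):

```python
people = {"ann", "bob"}
loves = {("ann", "bob"), ("bob", "ann")}   # invented toy facts

# Reading 1: forall x exists y loves(x, y) — everyone has some beloved
reading1 = all(any((x, y) in loves for y in people) for x in people)

# Reading 2: exists y forall x loves(x, y) — someone is loved by everyone
reading2 = any(all((x, y) in loves for x in people) for y in people)

print(reading1, reading2)  # True False: same sentence, different logic
```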

Donkey Sentences and Anaphora

'Every farmer who owns a donkey beats it' poses challenges. What does 'it' refer to? What's the scope of 'a donkey'? Dynamic semantics and discourse representation theory address these puzzles.

Logical Form

Logical form (LF) is the abstract syntactic structure that determines semantic interpretation. It's often distinct from surface syntactic structure.

Extracting logical form from natural language sentences reveals hidden complexity and explains semantic properties like ambiguity, entailment, and anomaly.

Deep Structure vs Surface Structure

The surface form 'What did John eat?' differs from the deep/logical form, in which 'what' originates as the object of 'eat'. Movement operations map between surface and logical form.

Lambda Calculus and Variable Binding

Lambda abstraction (λx.P(x)) creates functions from formulas. Essential for compositional semantics: 'walks' might denote λx.walk(x), which combines with 'John' to yield walk(john).
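Python lambdas can stand in for lambda terms. In this illustrative sketch (the sets and names are invented), a predicate denotes a function from entities to truth values, and a quantifier denotes a higher-order function over predicates:

```python
# Invented toy facts.
walkers = {"john", "mary"}
people = {"john", "mary", "sue"}

# [[walks]] = λx. walk(x): a function from entities to truth values
walks = lambda x: x in walkers
print(walks("john"))     # True: walk(john)

# [[everyone]] = λP. ∀x P(x): a higher-order function over predicates
everyone = lambda P: all(P(x) for x in people)
print(everyone(walks))   # False: sue doesn't walk
```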

Type Theory (Montague Grammar)

Richard Montague used typed lambda calculus to model compositionality. Every expression has a type (e for entities, t for truth values, etc.), and combination respects type constraints.

Categorial Grammar

Syntactic categories are logical types. A transitive verb has type (NP\S)/NP—it combines with object NP on right and subject NP on left to form sentence S. Syntax mirrors semantics.
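The curried order of combination can be mimicked with nested functions. A hedged sketch (the facts and names are invented): the transitive verb takes its object first, yielding a verb phrase, which then takes the subject.

```python
loves_facts = {("john", "mary")}             # invented toy facts

# [[loves]] = λy.λx.love(x, y), type e -> (e -> t), category (NP\S)/NP
loves = lambda y: (lambda x: (x, y) in loves_facts)

vp = loves("mary")    # combine with the object NP on the right: NP\S
s = vp("john")        # combine with the subject NP on the left: S
print(s)              # True: love(john, mary)
```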

Presupposition and Implicature

Not all aspects of meaning are truth-conditional. Presuppositions and implicatures add layers of meaning that formal semantics must account for using logical tools.

Semantic Presupposition

'The king of France is bald' presupposes France has a king. Both the sentence and its negation carry this presupposition—it survives negation and questioning.
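One common way to model presupposition failure is as a truth-value gap. The sketch below (toy facts and names invented; `None` stands in for 'neither true nor false') returns a gap when the definite description has no referent:

```python
# Invented toy facts; France has no king, fictional Ruritania does.
kings = {"Ruritania": "rudolf"}

def the_king_of(country):
    """Definite description: None when the presupposition
    (that the country has a king) fails."""
    return kings.get(country)

def is_bald(entity, bald_set):
    if entity is None:
        return None        # presupposition failure: truth-value gap
    return entity in bald_set

print(is_bald(the_king_of("France"), {"rudolf"}))     # None, not False
print(is_bald(the_king_of("Ruritania"), {"rudolf"}))  # True
```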

Pragmatic Presupposition

Presuppositions depend on context and speaker assumptions. 'Even John came' presupposes that others came and that John was unlikely to come. Such presuppositions are cancellable in certain contexts.

Presupposition Projection

How presuppositions of parts project to presuppositions of wholes. 'If France has a king, the king of France is bald' inherits the presupposition differently than the simple sentence: the antecedent filters it, so the conditional as a whole does not presuppose that France has a king.

Gricean Implicature

H. P. Grice distinguished what is said (truth-conditional meaning) from what is implicated (conversational implicature). 'Some students passed' implicates that not all passed, via the maxim of quantity.

Scalar Implicature

Use of a weaker term on a scale (⟨all, most, many, some⟩) implicates the negation of the stronger alternatives. Formal pragmatics uses logic to model these inferences.
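The inference pattern can be sketched as a function over a Horn scale (the list representation below is an illustrative simplification, not a standard formalization):

```python
# A Horn scale from weakest to strongest (illustrative).
scale = ["some", "many", "most", "all"]

def scalar_implicatures(term):
    """Using 'term' implicates the negation of each stronger
    alternative on the scale."""
    stronger = scale[scale.index(term) + 1:]
    return [f"not {alt}" for alt in stronger]

print(scalar_implicatures("some"))  # ['not many', 'not most', 'not all']
print(scalar_implicatures("all"))   # []: nothing stronger to negate
```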

Modality in Language

Natural languages express necessity, possibility, obligation, and permission through modal verbs and other devices. Modal logic provides tools to analyze modal meaning.

Epistemic Modals

'Must', 'might', 'could', 'may' express speaker's epistemic state. 'It must be raining' means the speaker infers rain from evidence. Analyzed using modal logic and possible worlds.

Deontic Modals

'Should', 'ought', 'may', 'must' express obligation and permission. 'You should leave' imposes obligation. Deontic logic models these normative meanings.

Dynamic Modals

'Can', 'able to' express ability or dispositional properties. 'John can swim' attributes swimming ability—a different modal flavor than epistemic or deontic.

Evidentiality

Some languages grammatically mark information source (direct observation, inference, hearsay). Epistemic logic extended with evidential operators models this semantic category.

Modal Base and Ordering Source

Kratzer's analysis: modals quantify over possible worlds restricted by modal base (contextually relevant worlds) and ordered by ordering source (what's ideal/normal). Provides unified analysis of modal varieties.
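A rough sketch of the idea (the worlds, scores, and names below are invented, and the numeric score is a simplification: Kratzer's ordering source is a set of propositions inducing a partial order, not a number):

```python
# Invented toy worlds with a numeric stand-in for the ordering source.
worlds = [
    {"name": "w1", "law_score": 2, "pays_tax": True},
    {"name": "w2", "law_score": 1, "pays_tax": False},
    {"name": "w3", "law_score": 2, "pays_tax": True},
]

def must(prop, worlds, modal_base, ordering):
    """'must P': P holds in all the best worlds of the modal base."""
    accessible = [w for w in worlds if modal_base(w)]
    top = max(ordering(w) for w in accessible)
    best = [w for w in accessible if ordering(w) == top]
    return all(prop(w) for w in best)

base = lambda w: True               # modal base: all worlds relevant
ideal = lambda w: w["law_score"]    # ordering source: more ideal = higher

print(must(lambda w: w["pays_tax"], worlds, base, ideal))      # True
print(must(lambda w: w["name"] == "w1", worlds, base, ideal))  # False
```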

Negation

Negation in natural language is more complex than logical NOT. Scope, polarity, and pragmatic effects create rich patterns requiring sophisticated logical analysis.

Sentential vs Constituent Negation

'John didn't leave' (sentential negation: ¬leave(john)) vs 'Not John left' (constituent negation, targeting the subject). Logical scope and focus determine the interpretation.

Negative Polarity Items

Items like 'any', 'ever', 'yet' require downward-entailing contexts. 'I didn't see anyone' is fine; *'I saw anyone' is bad. Requires logical characterization of licensing environments.
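The licensing condition can be checked by brute force over a small domain: a determiner is downward entailing in its scope argument if shrinking the scope set never turns truth into falsity. A sketch with invented helper names:

```python
from itertools import combinations

domain = {"a", "b", "c"}

def subsets(s):
    s = list(s)
    return [set(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def downward_entailing_in_scope(det):
    """True if whenever det(A, B) holds and B' ⊆ B, det(A, B') holds."""
    return all(
        det(A, small)
        for A in subsets(domain)
        for B in subsets(domain)
        if det(A, B)
        for small in subsets(B)
    )

no_det = lambda A, B: not (A & B)     # 'no A is B'
some_det = lambda A, B: bool(A & B)   # 'some A is B'

print(downward_entailing_in_scope(no_det))    # True: 'no' licenses NPIs
print(downward_entailing_in_scope(some_det))  # False: 'some' doesn't
```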

Double Negation and Negative Concord

In logic, ¬¬P ≡ P. But some languages (French, Spanish) use negative concord, where multiple negative elements express a single negation: 'Je ne vois personne' (literally 'I not-see nobody') means 'I see nobody', not 'I see somebody'.

Metalinguistic Negation

'I didn't trap two rabbits; I trapped three' negates the implicature, not the truth-conditional content. Shows negation can target non-truth-conditional aspects of meaning.

Computational Applications

Formal logic enables computational processing of natural language. From semantic parsing to question answering, logical representations bridge linguistic analysis and automated reasoning.

Modern NLP increasingly uses logic-based methods alongside statistical approaches, especially for tasks requiring precise reasoning and compositional understanding.

Natural Language Processing

Computational linguistics uses logical formalisms to represent meaning, enabling machines to understand and generate language. Semantic parsing converts sentences to logical forms for automated reasoning.

Semantic Parsing

Automatically converting sentences to formal semantic representations (first-order logic, lambda calculus, SQL). Enables question answering, database querying, and semantic search.
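A minimal pattern-based parser illustrates the sentence-to-logical-form step (this is a toy sketch, nothing like a production semantic parser; the patterns and output format are invented):

```python
import re

def parse(sentence):
    """Map two toy sentence patterns to first-order-logic-style strings."""
    m = re.fullmatch(r"every (\w+) (\w+)", sentence.lower())
    if m:
        noun, verb = m.groups()
        return f"forall x ({noun}(x) -> {verb}(x))"
    m = re.fullmatch(r"a (\w+) (\w+)", sentence.lower())
    if m:
        noun, verb = m.groups()
        return f"exists x ({noun}(x) & {verb}(x))"
    return None  # sentence not covered by this toy grammar

print(parse("Every student passed"))  # forall x (student(x) -> passed(x))
print(parse("A student passed"))      # exists x (student(x) & passed(x))
```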

Question Answering Systems

Systems like IBM Watson combine statistical techniques with logical inference over knowledge bases. Questions are parsed into logical queries and answered by reasoning over logical representations of knowledge.

Textual Entailment

Determining if text T entails hypothesis H. 'John bought a car' entails 'John owns a vehicle'. Requires logical inference over semantic representations.
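For purely propositional cases, entailment can be checked by enumerating valuations (an illustrative sketch only; real textual entailment also needs lexical and world knowledge, e.g. that buying yields owning):

```python
from itertools import product

def entails(premise, hypothesis, atoms):
    """premise entails hypothesis iff hypothesis is true in every
    valuation that makes premise true."""
    valuations = (dict(zip(atoms, vals))
                  for vals in product([True, False], repeat=len(atoms)))
    return all(hypothesis(v) for v in valuations if premise(v))

atoms = ["car", "bike"]
T = lambda v: v["car"]                # 'John bought a car'
H = lambda v: v["car"] or v["bike"]   # 'John bought a car or a bike'

print(entails(T, H, atoms))  # True: the disjunction is weaker
print(entails(H, T, atoms))  # False: the converse fails
```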

Grammar Formalisms

  • Context-Free Grammars: Classical formalism with logical foundations in formal language theory
  • Type-Logical Grammar: Uses typed lambda calculus; syntax-semantics correspondence via Curry-Howard isomorphism
  • HPSG (Head-Driven Phrase Structure Grammar): Feature structures with logical constraints
  • Minimalist Syntax: Derives logical form through syntactic operations like Merge and Move
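The logical character of context-free grammars can be seen in a tiny recognizer (the grammar, lexicon, and helper names below are invented for illustration):

```python
# A toy CFG: nonterminals map to lists of productions; anything not
# in the grammar is treated as a terminal word.
grammar = {
    "S":  [["NP", "VP"]],
    "NP": [["john"], ["mary"]],
    "VP": [["walks"], ["loves", "NP"]],
}

def parses(symbol, words):
    """Yield the number of words consumed for each way 'symbol'
    can derive a prefix of 'words'."""
    if symbol not in grammar:                  # terminal word
        if words and words[0] == symbol:
            yield 1
        return
    for production in grammar[symbol]:
        consumed_options = [0]
        for part in production:                # derive parts left to right
            consumed_options = [
                c + extra
                for c in consumed_options
                for extra in parses(part, words[c:])
            ]
        yield from consumed_options

def accepts(sentence):
    words = sentence.split()
    return any(n == len(words) for n in parses("S", words))

print(accepts("john loves mary"))  # True
print(accepts("loves john"))       # False
```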