## Lecture 10.1 Communication

• Communication has a computational-level goal: to transfer information from a source to a destination.
• If M (meaning) and F (form) are independent, $H(M|F) = H(M)$.
• If F deterministically maps to M, $H(M|F) = 0$.
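These two properties of conditional entropy can be checked numerically. A minimal Python sketch, where the meaning/form distributions are toy values made up for illustration:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution {outcome: prob}."""
    return -sum(px * math.log2(px) for px in p.values() if px > 0)

def conditional_entropy(joint):
    """H(M|F) from a joint distribution {(m, f): prob}."""
    # marginal distribution over forms F
    pf = {}
    for (m, f), p in joint.items():
        pf[f] = pf.get(f, 0.0) + p
    # H(M|F) = sum_{m,f} p(m,f) log2( p(f) / p(m,f) )
    return sum(p * math.log2(pf[f] / p) for (m, f), p in joint.items() if p > 0)

pm = {"cat": 0.5, "dog": 0.5}

# M and F independent: knowing the form tells us nothing about the meaning
indep = {(m, f): pm[m] * 0.5 for m in pm for f in ("a", "b")}
print(conditional_entropy(indep))  # → 1.0, equal to H(M)

# each form maps to exactly one meaning: form fully determines meaning
determ = {("cat", "a"): 0.5, ("dog", "b"): 0.5}
print(conditional_entropy(determ))  # → 0.0
```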

### Key terms

communication, information, source, destination, computational level goal, necessary subgoal

source, probability distribution over meaning, transmitter, encoder, function, meaning, form, receiver, decoder, noise source, reference resolution

information theory, information content, surprise, entropy, uncertainty, outcome, conditional entropy, mutual information, deterministic, KL divergence

### Equations and formulas

• Surprisal: $I(x) = -\log_2 P(x)$
• Entropy: $H(X) = \sum_x P(x)\,I(x)$
• Mutual information, the information shared between two variables: $I(M;F) = H(M) - H(M|F)$
• Kullback-Leibler (KL) divergence, the extra information needed when using distribution Q in place of the true distribution P: $KL(P\|Q) = \sum_x P(x)\log\frac{P(x)}{Q(x)}$
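The formulas above can be sketched in a few lines of Python; the coin-flip distributions below are toy values chosen for illustration:

```python
import math

def surprisal(p):
    """I(x) = -log2 P(x), in bits."""
    return -math.log2(p)

def entropy(dist):
    """H(X) = sum_x P(x) I(x), for dist = {outcome: prob}."""
    return sum(p * surprisal(p) for p in dist.values() if p > 0)

def kl(p, q):
    """KL(P||Q) = sum_x P(x) log2( P(x) / Q(x) )."""
    return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

fair = {"H": 0.5, "T": 0.5}
biased = {"H": 0.9, "T": 0.1}

print(surprisal(0.5))    # → 1.0 bit: an even-odds outcome carries one bit
print(entropy(fair))     # → 1.0: maximal uncertainty for two outcomes
print(entropy(biased))   # ≈ 0.469: a skewed coin is less uncertain
print(kl(biased, fair))  # ≈ 0.531: cost of modeling the biased coin as fair
```

Note that entropy plus the KL divergence to the uniform distribution equals $\log_2$ of the number of outcomes, which the last two lines illustrate.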

## Lecture 10.2 Computational Psycholinguistics

### Key terms

tool, frequent, homophone, word sense, information processing effect, reading time

• Language as a communication system seems optimized for efficiency.
• Frequently needed words tend to be:
  • short in length
  • close to each other in sentences
  • versatile
• Homophones: same sounds but different meanings
• Word sense: a single meaning of a word

## Lecture 11 Linguistic Diversity & Efficient Communication

• Sapir-Whorf Hypothesis
  • Languages carve up the world in different ways.
  • Does this influence our conceptual system?
• You cannot minimize two objectives at once without specifying how they trade off.
• Objective function
  • Minimize description length (algorithmic complexity)
  • Minimize communicative cost: $KL[M\|\hat{M}]$
• There are a variety of fields that study linguistic diversity:
  • Sociolinguistics
  • Linguistic Anthropology
  • Linguistic Typology
• Semantic typology studies how different languages carve up the world.
• Recently, information theory has helped characterize and explain the diversity of semantic systems as communicative efficiency trade-offs.
• This is in line with the soft take of the Sapir-Whorf Hypothesis.

### Key terms

linguistic diversity, Sapir-Whorf Hypothesis, World Color Survey, Munsell Color Chart, basic color term, efficiency trade-off, principle of least effort, force of unification, force of diversification, communicative efficiency trade-offs, the information bottleneck, theoretical limit, observation, hypothetical variant, kinship system, objective function, description length, communication cost

### Equations and formulas

• The information bottleneck, minimized over encoders $q(w|m)$: $F_\beta[q(w|m)] = I_q(M;W) - \beta I_q(W;U)$, trading off complexity $I_q(M;W)$ against accuracy $I_q(W;U)$; maximizing the accuracy term is equivalent, up to an additive constant, to minimizing the expected communicative cost $KL[M\|\hat{M}]$.
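A minimal numeric sketch of this trade-off, with made-up meaning distributions and two hand-picked encoders; the objective is computed here as complexity plus $\beta$ times expected cost, which matches the formulation above up to an additive constant:

```python
import math

def ib_objective(p_m, q_w_given_m, meanings, beta):
    """Toy information-bottleneck objective for a naming system.

    p_m:         prior over meanings, p(m)
    q_w_given_m: encoder, q_w_given_m[m][w] = q(w|m)
    meanings:    meanings[m][u] = m(u), a distribution over referents u
    Returns (complexity I_q(M;W), expected cost E[KL(m || m_hat_w)],
             complexity + beta * cost).
    Assumes every word has nonzero marginal probability q(w).
    """
    n_m, n_w, n_u = len(p_m), len(q_w_given_m[0]), len(meanings[0])
    q_w = [sum(p_m[m] * q_w_given_m[m][w] for m in range(n_m)) for w in range(n_w)]
    # complexity: I_q(M; W)
    complexity = sum(
        p_m[m] * q_w_given_m[m][w] * math.log2(q_w_given_m[m][w] / q_w[w])
        for m in range(n_m) for w in range(n_w) if q_w_given_m[m][w] > 0
    )
    # listener's reconstruction: m_hat_w(u) = sum_m q(m|w) m(u)
    m_hat = [
        [sum(p_m[m] * q_w_given_m[m][w] / q_w[w] * meanings[m][u]
             for m in range(n_m)) for u in range(n_u)]
        for w in range(n_w)
    ]
    # expected communicative cost: E_{m,w}[ KL(m || m_hat_w) ]
    cost = sum(
        p_m[m] * q_w_given_m[m][w] * meanings[m][u]
        * math.log2(meanings[m][u] / m_hat[w][u])
        for m in range(n_m) for w in range(n_w) for u in range(n_u)
        if q_w_given_m[m][w] > 0 and meanings[m][u] > 0
    )
    return complexity, cost, complexity + beta * cost

# two meanings, each a distribution over two referents (made-up numbers)
p_m = [0.5, 0.5]
meanings = [[0.9, 0.1], [0.1, 0.9]]
# a precise two-word system: high complexity, zero cost
print(ib_objective(p_m, [[1, 0], [0, 1]], meanings, beta=1.0))    # (1.0, 0.0, 1.0)
# an uninformative system: zero complexity, high cost
print(ib_objective(p_m, [[0.5, 0.5], [0.5, 0.5]], meanings, beta=1.0))  # ≈ (0.0, 0.531, 0.531)
```

The two encoders sit at opposite ends of the trade-off; $\beta$ determines which end the minimizer prefers.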

## Lecture 12.1 Vector Space Semantics: Intuition

### Key terms

context vector, context words, target words, distributional similarity, syntactic categories, context window, word meaning, LSA, dimensionality reduction step
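A minimal sketch of how count-based context vectors are built from a corpus; the tiny corpus and window size are toy assumptions for illustration:

```python
from collections import Counter

def context_vectors(tokens, window=2):
    """For each target word, count how often each vocabulary word
    appears within +/- `window` tokens of it (a count-based context vector)."""
    vecs = {}
    for i, target in enumerate(tokens):
        ctx = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        vecs.setdefault(target, Counter()).update(ctx)
    return vecs

corpus = "the cat sat on the mat the dog sat on the rug".split()
vecs = context_vectors(corpus)
# distributional similarity: 'cat' and 'dog' occur in overlapping contexts
# ('the', 'sat', 'on'), so their count vectors are similar
print(vecs["cat"])
print(vecs["dog"])
```

In practice these counts come from large corpora and are followed by a dimensionality reduction step (as in LSA).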

## Lecture 12.2 Vector Space Semantics: Implementation

### Key terms

learn, word embedding, neural network, guess a word from its context, representation of the context, representation of the word, words within a context window, count vector, Word2Vec model, the input and output layers are one-hot encoded, each word is represented as a vector of size V (the number of words in the vocabulary), representation learning

• After training, the hidden layer $h_i$ for target word $y_i$ can be used as its context vector (word embedding).
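A minimal sketch of this architecture: a one-hot input of size V, a hidden layer, and a softmax output over the vocabulary. The vocabulary, layer sizes, and random weights are toy assumptions; this shows the forward pass only, not a trained model:

```python
import math
import random

random.seed(0)
V, H = 5, 3  # vocabulary size, hidden (embedding) size
vocab = ["the", "cat", "sat", "on", "mat"]
# input->hidden weights: row i is the embedding of word i
W_in = [[random.uniform(-0.5, 0.5) for _ in range(H)] for _ in range(V)]
# hidden->output weights
W_out = [[random.uniform(-0.5, 0.5) for _ in range(V)] for _ in range(H)]

def forward(word):
    i = vocab.index(word)
    # multiplying a one-hot vector by W_in just selects row i:
    # this is the hidden layer h_i, which after training is the embedding
    h = W_in[i]
    # hidden layer times W_out gives a score for each vocabulary word
    scores = [sum(h[k] * W_out[k][j] for k in range(H)) for j in range(V)]
    # softmax turns scores into p(context word | target word)
    z = [math.exp(s) for s in scores]
    total = sum(z)
    probs = [x / total for x in z]
    return h, probs

h, probs = forward("cat")
print("embedding of 'cat':", h)
print("context distribution sums to", round(sum(probs), 6))  # → 1.0
```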

## Lecture 12.3 Vector Space Semantics: Evaluation

### Key terms

model evaluation, semantic priming, lexical decision experiment
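One common way to relate embeddings to semantic-priming results is cosine similarity: related prime-target pairs (which speed up lexical decisions) should have more similar vectors than unrelated pairs. A minimal sketch; the embeddings below are hypothetical values, not outputs of a trained model:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# hypothetical embeddings for illustration
emb = {
    "doctor": [0.9, 0.8, 0.1],
    "nurse":  [0.8, 0.9, 0.2],
    "bread":  [0.1, 0.2, 0.9],
}
# a related prime ("doctor" -> "nurse") should score higher than an
# unrelated one ("bread" -> "nurse"), mirroring faster lexical decisions
print(cosine(emb["doctor"], emb["nurse"]) > cosine(emb["bread"], emb["nurse"]))  # → True
```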