Gestures are now viewed as an integral part of spoken language. But little attention has been
paid to the recipients' cognitive processes of integrating both gesture and speech. How do
people understand a speaker's gestures when these are inserted into gaps in the flow of speech? What
cognitive-semiotic mechanisms allow this integration to occur? And what linguistic and gestural
properties do people draw on when construing multimodal meaning? This book offers answers by
investigating multimodal utterances in which speech is replaced by gestures. Through
fine-grained cognitive-linguistic and cognitive-semiotic analyses of multimodal utterances,
combined with naturalistic perception experiments, six chapters explore gestures' potential to
realize grammatical notions of nouns and verbs and to integrate with speech by merging into
multimodal syntactic constructions. Analyses of speech-replacing gestures and a range of
related phenomena compel us to consider gestures, as well as spoken and signed language, as
manifestations of the same conceptual system. An overarching framework is proposed for studying
these different modalities together: a multimodal cognitive grammar.