Languages are powerful solutions to coordination problems: They provide stable, shared expectations about how the words we say correspond to the beliefs and intentions in our heads. Yet language use in a variable and nonstationary social environment requires linguistic representations to be flexible: Old words acquire new ad hoc or partner-specific meanings on the fly. In this article, we introduce continual hierarchical adaptation through inference (CHAI), a hierarchical Bayesian theory of coordination and convention formation that aims to reconcile the long-standing tension between these two basic observations. We argue that the central computational problem of communication is not simply transmission, as in classical formulations, but continual learning and adaptation over multiple timescales. Partner-specific common ground quickly emerges from social inferences within dyadic interactions, while community-wide social conventions are stable priors that have been abstracted away from interactions with multiple partners. We present new empirical data alongside simulations showing how our model provides a computational foundation for several phenomena that have posed a challenge for previous accounts: (a) the convergence to more efficient referring expressions across repeated interaction with the same partner, (b) the gradual transfer of partner-specific common ground to strangers, and (c) the influence of communicative context on which conventions eventually form.
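For concreteness, the two timescales described above admit a standard hierarchical Bayesian formulation; the notation here is an illustrative sketch rather than the article's own. Community-level conventions Θ act as a prior over partner-specific variables θ_k, which in turn generate the observations D_k gathered with partner k:

\[
P(\Theta, \theta_1, \ldots, \theta_K \mid D) \;\propto\; P(\Theta) \prod_{k=1}^{K} P(\theta_k \mid \Theta)\, P(D_k \mid \theta_k)
\]

On this reading, partner-specific common ground corresponds to the rapidly updated posterior over θ_k within a dyadic interaction, while community-wide conventions correspond to the posterior over Θ, which pools evidence across many partners and therefore shifts on a slower timescale.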