Facial expressions are vital communicators of emotion; it is partly by reading these expressions that we innately and accurately discern the emotional states of those around us. This paper identifies the activatable facial features that form the language of emotional expression in the face, together with the set of emotions that each such feature tends to express. Finally, it is shown how the fault lattice perception theory [6] can be used to compute the emotion registered on a face from the configuration of its salient features. It is posited that the ability of a computer to make such interpretations would significantly enhance human-computer interaction.
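The mapping from feature configurations to emotions can be illustrated with a minimal sketch. The feature names and feature-to-emotion sets below are purely illustrative placeholders, not the paper's actual catalogue, and the scoring rule (counting how many active features are consistent with each emotion) is a simplification, not the cited fault lattice perception theory [6].

```python
# Hypothetical feature-to-emotion table; entries are illustrative only.
FEATURE_EMOTIONS = {
    "raised_brows":     {"surprise", "fear"},
    "lowered_brows":    {"anger"},
    "open_mouth":       {"surprise", "fear"},
    "lip_corners_up":   {"happiness"},
    "lip_corners_down": {"sadness"},
}

def infer_emotion(active_features):
    """Return the emotion most consistent with the activated features,
    scored by how many active features each emotion is associated with."""
    scores = {}
    for feature in active_features:
        for emotion in FEATURE_EMOTIONS.get(feature, ()):
            scores[emotion] = scores.get(emotion, 0) + 1
    # No recognizable feature active: no emotion inferred.
    return max(scores, key=scores.get) if scores else None

print(infer_emotion({"lip_corners_up"}))  # → happiness
```

A real system would replace the flat count with the lattice-based computation described in the paper, but the structure of the problem is the same: activated features vote for the emotions they tend to express.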