Specification of Multimodal Interactions in NCL

Oct 27, 2015

This paper proposes an approach to integrate multimodal events--both user-generated, e.g., audio recognizer, motion sensors; and user-consumed, e.g., speech synthesizer, haptic synthesizer--into programming languages for the declarative specification of ...
In this paper, we aim at studying how the NCL multimedia language could take advantage of those new recognition technologies. To do so, we revisit the model behind NCL, named NCM (Nested Context Model), and extend it with first-class concepts supporting multiuser and multimodal features.
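To make the idea concrete, the following sketch shows what a declarative multimodal specification of this kind could look like in NCL syntax. Note that this is an illustrative assumption, not the paper's actual proposal: the media type "application/x-speech-recognition", the "onRecognition" role, and the connector name are hypothetical names invented here, while the surrounding structure (media, connectorBase, link, bind) follows standard NCL 3.0.

```xml
<!-- Hypothetical sketch of a multimodal NCL application.
     "application/x-speech-recognition" and the "onRecognition"
     role are invented for illustration; they are NOT standard NCL 3.0. -->
<ncl id="multimodalExample" xmlns="http://www.ncl.org.br/NCL3.0/EDTVProfile">
  <head>
    <connectorBase>
      <causalConnector id="onRecognitionStart">
        <simpleCondition role="onRecognition"/> <!-- hypothetical role -->
        <simpleAction role="start"/>
      </causalConnector>
    </connectorBase>
  </head>
  <body>
    <!-- user-generated modality: a speech recognizer modeled as a media object -->
    <media id="voiceCmd" type="application/x-speech-recognition" src="commands.srgs"/>
    <!-- conventional media object started when a command is recognized -->
    <media id="video" src="movie.mp4"/>
    <link xconnector="onRecognitionStart">
      <bind role="onRecognition" component="voiceCmd"/>
      <bind role="start" component="video"/>
    </link>
  </body>
</ncl>
```

The point of the sketch is the modeling style: a recognizer is treated as a first-class media object whose recognition events participate in ordinary NCL causal links, rather than being handled by imperative glue code.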
ABSTRACT. This paper introduces two innovative tools for enhancing interactive multimedia authoring using the Nested Context Language (NCL): (i) a visual ...
This multimodal extension enables user interface designers to develop application interfaces for multiple access channels within a single development framework.
This work proposes an extension to the Brazilian Ginga-NCL DTV middleware that provides multimodal interaction, supporting new interaction events and multiple ...