Artificial Intelligence (AI) is of national strategic importance due to the impressive results of deep learning algorithms across a wide range of applications in domains such as natural language processing (NLP), communications, medicine, law, political analytics, and the military.
During the last decade, large efforts have been devoted to end-to-end spoken language understanding (SLU) systems, motivated by the feasibility of popular applications such as personal assistants and conversational systems. Superior results have been observed with these systems in automatic speech recognition (ASR) using architectures based on a hypercomplex algebra, the quaternions, which require less processing time and fewer parameters to be estimated than models based on real numbers alone.
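The parameter saving comes from the structure of quaternion multiplication: one quaternion weight, with only 4 real components, defines a full 4x4 real linear map through the Hamilton product. The sketch below illustrates this sharing in NumPy; it is a minimal, illustrative example, not the architecture used in the cited ASR systems, and the function name `hamilton_matrix` is ours.

```python
import numpy as np

def hamilton_matrix(w):
    """Build the 4x4 real matrix that left-multiplies a quaternion
    (r, i, j, k) by the quaternion weight w = (a, b, c, d).
    All 16 entries are determined by only 4 free parameters."""
    a, b, c, d = w
    return np.array([
        [a, -b, -c, -d],
        [b,  a, -d,  c],
        [c,  d,  a, -b],
        [d, -c,  b,  a],
    ])

rng = np.random.default_rng(0)
w = rng.normal(size=4)           # one quaternion weight: 4 parameters
x = rng.normal(size=4)           # one quaternion-valued input

y = hamilton_matrix(w) @ x       # quaternion product as a real matmul

# A fully general real-valued layer on the same 4-dim input would need
# 16 independent parameters; the quaternion layer shares 4 parameters
# across all 16 entries, a 4x reduction that compounds across layers.
print(w.size, hamilton_matrix(w).size)  # 4 vs 16
```

Multiplying by the identity quaternion (1, 0, 0, 0) leaves the input unchanged, which is a quick sanity check on the matrix construction.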
Reducing the number of model parameters makes it possible to train neural architectures effectively with limited amounts of data, which are often difficult to obtain for concepts and conversational semantic contexts in specific, unconventional domains. A further difficulty is the inherently sequential nature of recurrent models, which precludes parallelization within training examples. Furthermore, error analysis has shown the importance of leveraging prior domain knowledge for semantic interpretation. This project will investigate novel attention models that use semantics to focus on specific contextual information to improve concept classification.
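The attention mechanism proposed above can be sketched as scaled dot-product attention in which the query plays the role of a semantic-context vector that scores contextual token representations. This is a minimal illustration under our own assumptions (random vectors, single-head attention); it is not the project's actual model.

```python
import numpy as np

def attention(query, keys, values):
    """Scaled dot-product attention: the query (here, a hypothetical
    semantic-context vector) scores each key; the softmax weights
    decide which contextual values the model focuses on."""
    scores = keys @ query / np.sqrt(query.size)
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ values, weights

rng = np.random.default_rng(1)
d = 8
context = rng.normal(size=d)       # illustrative semantic-context query
tokens = rng.normal(size=(5, d))   # 5 contextual token representations

summary, attn = attention(context, tokens, tokens)
print(attn.round(3))               # one weight per token, summing to 1
```

The attention weights form a probability distribution over the tokens, so the returned summary is a context-dependent weighted average that a downstream concept classifier could consume.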