Demonstration

Empowering Learners

Help non-native speakers learn and practice Modern Standard Arabic while reducing the influence of dialects.

Personalized Learning Experience

Provide tailored learning experiences with targeted feedback to boost confidence and encourage continued learning.

Strengthen Arabic Speech Research

Model the Arabic phonetic space more accurately, handling different accents, dialects, and speaking styles. Improve L2 and children's speech models and explore acoustic modeling and augmentation techniques.

Papers

The research behind Q-Voice.

Automatic Pronunciation Assessment - A Review

Paper Reference: Accepted at EMNLP 2023 (Findings)

Paper here

L1-aware Multilingual Mispronunciation Detection Framework

Paper Reference: ICASSP 2024

Paper here

Multi-View Multi-Task Representation Learning for Mispronunciation Detection

Paper Reference: El Kheir, Y., Chowdhury, S., Ali, A. (2023) Multi-View Multi-Task Representation Learning for Mispronunciation Detection. Proc. 9th Workshop on Speech and Language Technology in Education (SLaTE), 86-90, doi: 10.21437/SLaTE.2023-18

Paper here

MyVoice: Arabic Speech Resource Collaboration Platform

Paper Reference: Elshahawy, Y., El Kheir, Y., Chowdhury, S.A., Ali, A.M. (2023) MyVoice: Arabic Speech Resource Collaboration Platform. Proc. INTERSPEECH 2023, 3685-3686

Paper here

QVoice: Arabic Speech Pronunciation Learning Application

Paper Reference: El Kheir, Y., Khnaisser, F., Chowdhury, S.A., Mubarak, H., Afzal, S., Ali, A.M. (2023) QVoice: Arabic Speech Pronunciation Learning Application. Proc. INTERSPEECH 2023, 3677-3678

Paper here

The Complementary Roles of Non-Verbal Cues for Robust Pronunciation Assessment

Paper Reference: arXiv preprint

Paper here

SpeechBlender: Speech Augmentation Framework for Mispronunciation Data Generation

Paper Reference: El Kheir, Y., Chowdhury, S., Ali, A., Mubarak, H., Afzal, S. (2023) SpeechBlender: Speech Augmentation Framework for Mispronunciation Data Generation. Proc. 9th Workshop on Speech and Language Technology in Education (SLaTE), 26-30, doi: 10.21437/SLaTE.2023-6

Paper here

Contact

Feel free to connect.

Contact the team

Have something to say? We are here to help — send us an email.