Please use this identifier to cite or link to this item:
https://dspace.univ-ouargla.dz/jspui/handle/123456789/35019
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | BENKADDOUR, Mohammed Kamel | - |
dc.contributor.author | Abazi, Yahia | - |
dc.contributor.author | Aziza Akram, Zakaria | - |
dc.date.accessioned | 2023-11-14T14:38:06Z | - |
dc.date.available | 2023-11-14T14:38:06Z | - |
dc.date.issued | 2023 | - |
dc.identifier.uri | https://dspace.univ-ouargla.dz/jspui/handle/123456789/35019 | - |
dc.description.abstract | The recognition of Arabic sign language (ArSL) plays a crucial role in removing communication barriers between deaf-mute people and non-sign-language speakers. In this study, we propose a dynamic model for Arabic sign language recognition using deep learning (DL) techniques. Our model uses a convolutional neural network (CNN) architecture to extract meaningful features from sign language (SL) images, enabling accurate classification of different signs. We also describe the dataset used for training and evaluating the model, which consists of a collection of Arabic sign language images. After extensive experimentation and evaluation, the results demonstrate the effectiveness of the proposed methods, achieving high recognition accuracy across multiple ArSL gestures. | en_US |
dc.language.iso | en | en_US |
dc.publisher | UNIVERSITY OF KASDI MERBAH OUARGLA | en_US |
dc.subject | Sign language | en_US |
dc.subject | Arabic sign language | en_US |
dc.subject | Deep learning | en_US |
dc.subject | Convolutional neural network | en_US |
dc.subject | Recognition | en_US |
dc.subject | Classification | en_US |
dc.title | Hand gesture and sign language recognition based on deep learning | en_US |
dc.type | Thesis | en_US |
Appears in Collections: | Département d'informatique et technologie de l'information - Master |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
ABAZI-AZIZA AKRAM.pdf | - | 2.12 MB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
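The abstract above describes a CNN pipeline for ArSL image recognition: convolutional feature extraction from sign images followed by classification into sign classes. The record does not include the thesis's actual model, so the following is only a minimal NumPy sketch of the stages such a pipeline involves (convolution, ReLU, max-pooling, then a softmax classifier). The 28-class output (one per Arabic alphabet sign), the image size, and all filter/weight values are assumptions for demonstration, not taken from the thesis:

```python
import numpy as np

def conv2d(image, kernels):
    """Valid 2D convolution: one feature map per kernel (no padding, stride 1)."""
    kh, kw = kernels.shape[1:]
    h, w = image.shape
    out = np.zeros((kernels.shape[0], h - kh + 1, w - kw + 1))
    for k, kern in enumerate(kernels):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[k, i, j] = np.sum(image[i:i + kh, j:j + kw] * kern)
    return out

def relu(x):
    """Rectified linear activation."""
    return np.maximum(x, 0)

def max_pool(fmaps, size=2):
    """Non-overlapping max-pooling over each feature map."""
    c, h, w = fmaps.shape
    h2, w2 = h // size, w // size
    return fmaps[:, :h2 * size, :w2 * size].reshape(c, h2, size, w2, size).max(axis=(2, 4))

def softmax(z):
    """Normalize logits into a probability distribution over classes."""
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
image = rng.random((28, 28))               # stand-in for a grayscale ArSL hand image
kernels = rng.standard_normal((4, 3, 3))   # 4 random 3x3 filters (learned in a real CNN)
feats = max_pool(relu(conv2d(image, kernels)))          # shape (4, 13, 13)
weights = rng.standard_normal((28, feats.size)) * 0.01  # 28 assumed sign classes
probs = softmax(weights @ feats.ravel())                # per-class probabilities
print(probs.shape)
```

In a trained network the filters and classifier weights would be learned by backpropagation over the labeled ArSL image dataset the abstract mentions; here they are random, so only the data flow (image → feature maps → class probabilities) is meaningful.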