Please use this identifier to cite or link to this item: https://dspace.univ-ouargla.dz/jspui/handle/123456789/21507
Full metadata record
DC Field: Value
dc.contributor.advisor: Abdelmalik TALEB AHMED
dc.contributor.advisor: AIADI, Kamal Eddine
dc.contributor.author: Tidjani, Amina
dc.date.accessioned: 2019-09-26T10:07:28Z
dc.date.available: 2019-09-26T10:07:28Z
dc.date.issued: 2019
dc.identifier.uri: http://dspace.univ-ouargla.dz/jspui/handle/123456789/21507
dc.description.abstract: Face-based automatic kinship verification is a novel and challenging research problem in computer vision. It automatically examines facial attributes to predict whether two persons have a biological kin relation. The focus is on providing novel, efficient solutions to the challenges of family verification from faces, with the aim of improving the accuracy of kinship verification. In our work, we analyzed facial kinship verification systems in two modes: unimodal and multimodal. Feature extraction is a crucial step in a kinship recognition system. For this reason, we proposed two efficient feature-learning extraction algorithms, called Discrete Cosine Transform Network (DCTNet) and Context-Aware Local Binary Feature Learning (CA-LBFL). Various databases were used and extensive experiments were carried out to validate our proposed and developed methods. The experimental results demonstrate that the proposed methods achieve competitive results compared with other state-of-the-art approaches. (en_US)
dc.language.isoenen_US
dc.publisher2019en_US
dc.relation.ispartofseries2019;-
dc.subjectBiometricsen_US
dc.subjectCA-LBFLen_US
dc.subjectKinshipen_US
dc.subjectVeri cationen_US
dc.subjectDCTNeten_US
dc.titleFace based Automatic Kinship Veri cationen_US
dc.typeThesisen_US
Appears in Collections: Département d'Electronique et des Télécommunications - Doctorat

Files in This Item:
File: Tidjani_Amina.pdf (873,78 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.